Google MUM and The Future of Search


Google search is complex. While Google provides insights into the technology it uses and clues about how its various algorithms work, there’s no instruction manual that explains exactly why one result ranks above everything else for a given keyword.

As Google develops tools to better understand the complexities of linguistics and search intent, SEOs try to find ways to ensure their content is the best in the eyes of the search engine.

The most recent algorithmic tool, Google’s Multitask Unified Model (MUM), equips search with its most sophisticated machine learning capabilities yet for understanding a user’s query.

iPullRank founder Michael King sat down with Erika Varangouli, Head of International Brand at Semrush, and Crystal Carter, Head of SEO Communications at Wix, to talk about Google MUM. In this webinar, they discussed Google MUM and the dramatic changes that are coming. 

Google MUM isn’t new, but that’s the problem: marketers aren’t ready.

Today, they’ll show you how to adjust your strategy so you’re aligned with this major update. 

https://ipullrank.wistia.com/medias/trh1oduovp
  • The impact MUM will have on your business [2:16]
  • What is Google MUM? [3:35]
  • How MUM helps Google to keep up [6:00]
  • MUM’s impact on BERT; what this means for your brand [7:46]
  • Seeing MUM in action on a live SERP [10:00]
  • What can we expect from MUM short-term and long-term? [11:18]
  • Who will rank well with MUM? [17:33]
  • How can you improve rankings with MUM? [18:40]
  • How does MUM work? [23:07]
  • What questions can MUM answer? [30:03]
  • Are marketers and SEOs ready for MUM? [36:39]
  • What can developers do to integrate and align with MUM? [39:46]
  • Practical tips you can use to improve SERP performance post-MUM [43:37]

The impact MUM will have on your business [2:16]

MUM has the potential to drastically change what you see in the SERPs. Google’s MUM update was introduced in May 2021, but the implications of this update haven’t been fully felt yet.

MUM is a paradigm shift changing how Google provides answers to searchers. 

When searchers enter their questions in Google search, they receive various answers. These answers are typically helpful, but they’re often not the right answer; Google aims to change that with MUM. 

What is Google MUM? [3:35]

Google’s Multitask Unified Model, or MUM, uses AI and natural language processing to answer complex searcher questions. It’s an algorithm update that refines how Google understands and presents information to you, the searcher.

Crystal explains that “Google’s goal with search is to organize the world’s information and make it universally accessible and useful.”

Previous updates like Hummingbird, RankBrain, and BERT were designed to help Google with specific tasks; MUM is a single, unified model that handles all indexing, retrieval, and ranking tasks. What makes MUM significant is that it’s oriented around information in all formats and across a variety of languages.

Google’s MUM update isn’t focused on web pages. It’s focused on information in all its forms. Some of that information comes from apps; some comes from video, audio files, or Google Maps. MUM enables Google to properly vet the information they’re giving to searchers, ensuring that it’s accurate, relevant, useful, and reliable.

If we look at Google’s history of algorithm updates, we see that they’re rolling out more updates and applying them to their index more rapidly.

How MUM helps Google to keep up [6:00]

In 2018, Google applied more than 3,000 updates.

In 2020, they applied more than 4,887 updates and ran more than 600,000 experiments and tests. That number continues to climb for a single reason.

Google has to make these updates to keep up with searchers: how they’re using the web and how they access information. Google wants to make the information searchers receive highly accessible. They want to maximize relevance and usefulness, and they want information from the best possible sources.

Meaning what exactly? 

Google’s search results have to account for:

  • where a person is searching from.
  • when they’re searching, including the time of day.
  • how they’re searching: what type of device (mobile, tablet, or desktop), what browser, voice search, or image search.
  • who is searching: their search and click history, interests, and demographic, psychographic, and ethnographic data.

If Google wants to maintain its position as king of the hill, they have to keep up. 

They continue to make these updates to ensure that search results are accessible, high quality, and trustworthy.

What about BERT?

Doesn’t BERT already do that for Google search? 

MUM’s impact on BERT; what this means for your brand [7:46]

MUM runs alongside BERT, but it’s 1,000x more powerful. 

MUM processes natural language, but it doesn’t stop there. With MUM, Google can parse data from 75+ languages within a single model, rather than relying on a separate language model for each language. MUM enables Google to process language internationally via text, images, audio, video, and various emerging media with the same kind of quality you expect as a searcher.

One key component of MUM is that it can take information from those formats (text, video, audio, and images) to fill in the contextual gaps of a search.

MUM is intended to give Google a deep, multi-language, multi-media understanding of the information on the web. 

Seeing MUM in action on a live SERP [10:00]

Understanding the framework and purpose of MUM provides a helpful starting point, but what does it look like in action? 

Crystal demonstrates MUM’s approach using the query ‘Spaghetti Squash.’

When you make that simple query, Google tries to guess your intent for searching the term. The search includes a definition, pictures, substitutes for spaghetti squash, dietary info, recipes to use it in, and more. You’ll also see a variety of videos addressing different intents for spaghetti squash (all of the different ways to cook it).

What can we expect from MUM short-term and long-term? [11:18]

With the continued rollout of MUM, what can we anticipate for SERP changes? 

In the short term, Crystal expects:

  • Large-scale MUM updates: More iterations and updates of the algorithm (even if they’re not foundational), with less aggressive corrections.
  • Less priority on visible text: Better results for image-heavy verticals, since MUM will develop a better understanding of photos and videos in the context of written text.
  • More SERP diversity and rich results: Contextual relationships will allow Google to understand and predict intent more effectively and to shape SERP results with more diversity and rich features that enhance the user experience.

In the long term, there’s a strong chance we’ll see:

  • High volatility in Medical/Health verticals: Specific verticals like healthcare might be significantly disrupted as Google invests in the space. 
  • Some reduction in prominence of English content: With the focus on multi-language understanding, other language content might be surfaced more frequently. 
  • More device/tool-driven updates: Tools like Google Lens allow for a more complete multimedia approach to search (as exhibited by the recent multisearch update announced at Google I/O 2022).

Who will rank well with MUM? [17:33]

Sites that are already ranking well for broad terms will continue to rank well. Google views sites like Wikipedia, WebMD, and others as highly authoritative. When they want to provide searchers with immediate answers to quick questions, they’ll continue to rely on these trustworthy sources. 

What about niche topics or subject areas? 

Google will use MUM to drill down into these specialized areas, putting multiple pieces of information together to generate a response. They’re going to pull relevant pieces of information from various sources and use this information to create entirely new answers. 

In Google’s example, a searcher who has already hiked Mt. Adams asks how they should prepare differently to hike Mt. Fuji in the fall; MUM surfaces a variety of local, commercial, and informational content to go beyond the initial query:

“Since MUM can surface insights based on its deep knowledge of the world, it could highlight that while both mountains are roughly the same elevation, fall is the rainy season on Mt. Fuji so you might need a waterproof jacket. MUM could also surface helpful subtopics for deeper exploration — like the top-rated gear or best training exercises — with pointers to helpful articles, videos and images from across the web.”

Source: Google Search Blog

What does this mean for you? 

Will it still be possible for you to improve your rankings post-MUM?

How can you improve rankings with MUM? [18:40]

This requires a mindset shift. 

Remember, with MUM, Google uses these algorithms to generate new answers from previously existing information. It’s no longer about being in front; it’s about being a part of the conversation. 

What does this mean? 

With MUM, it’s no longer about presenting five to ten answers. It’s not even about featured snippets. Google doesn’t want to give you ten options or require you to refine your query. Historically, that’s how it’s been done: you enter your query in Google, and they give you ten options (or more) that you have to filter through yourself.

They want to get away from that. 

They don’t want to give you the answer to a specific question; they want to give you your answer to your specific question. With MUM, Google aims to give you domain expertise and an answer specific to you and your circumstance. 

This is incredibly difficult. 

As human beings, we find it incredibly difficult to understand what another person may want or need, even if we know them. When searchers enter their query, are they having a bad day? Are they ashamed to ask for something specific? Do they even know what they want? What if they only have a vague idea?

This is why this update is mind-blowing. 

We’re now at a stage where Google (via MUM) is attempting to cater to searchers’ obvious, vague, and implied intents. These vary tremendously based on factors like location, personal search history, time of day, day of the week or month, and even how users are searching.

The opportunity for brands is to be that ‘complete answer,’ so MUM doesn’t have to work so hard to find information from disparate sources.

How does MUM work? [23:07]

Google’s trying to build the Star Trek computer.

Back in the early 2010s, Google’s then head of search, Amit Singhal, announced that "the destiny of search is to become that ‘Star Trek’ computer, and that’s what we are building." He said that Google would know what searchers want, so users would no longer have to type their queries into a search box.

With MUM, you’re seeing Google’s attempts at building an ambient computer. 

MUM will give Google the ability to answer very complex questions, providing searchers with an answer that’s specific to them. This requires domain expertise if they’re going to answer questions successfully for you. BERT relied on the same family of technology as models like GPT-3: using these large language models and a big data set of pages from across the web, Google trained its algorithms.

They extracted different words, and they built models to predict words. Based on that, they could generate copy that reads naturally; searchers wouldn’t know whether a person or a computer system wrote the information they were consuming.

MUM is 1,000x more powerful than BERT. Built on the T5 model, it gives Google a deeper and richer understanding of the information it’s working with, along with the intent and context behind it, and it allows Google to generate answers from various data points pulled from across its index.

Google is getting incredibly close to its ambient search engine.

This is how MUM will generate new, relevant, and precise answers to searcher queries. They’re less interested in giving you the answer and more interested in giving you your answer.

In 2021, Google produced an opinion paper called Rethinking Search: Making Domain Experts out of Dilettantes, which discusses what the goals of search should be.

Google understands that users are seeking out domain expertise and that search engines historically have not fulfilled that need. They understand that users have to go on a complete search journey to answer complex questions. Instead, they want to use language models to pull pieces of the web together and generate domain expertise. 

Additionally, the multimodal aspect of MUM, as mentioned, pulls together different media like video, images, and text to create the best answer. Multimodal search is not new.

Not only does it try to answer the objective question, but it also tries to take into consideration your context. Where are you physically? Where are you geographically? What have you searched for in the past? All of these contextual pieces of information can inform your search intent and why information will best serve you (the individual that is searching).

Mike provides an example: "How far is the Empire State Building?" For him, being based in Brooklyn, he can see it from his window, but that’s going to be a very different result for Crystal, who lives across the Atlantic Ocean. If he then asks, "How long will it take me to get there?", Google uses the previous question and his location to provide the best result, tailored to him.

If you’ve ever used an image as part of a query, then you’ve performed a multimodal search.

Google takes the written text query in conjunction with the image, identifies contextual information from both media, and blends together results that combine relevant visual, textual, and even video content.

Each component of the image is broken down into elements and converted to text to further contextualize the potential results.

So Google takes any input, breaks it down into elements, adapts it based on media format and additional context, applies weighted scoring for the quality of the results, and delivers your answers via the SERP.
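
To make that concrete, here’s a minimal, hypothetical sketch of the "break media down into text elements" idea. It uses an off-the-shelf image-captioning model from Hugging Face’s transformers library as a stand-in for Google’s internal systems; the model name, the blending step, and the file name are illustrative assumptions, not how MUM actually works.

```python
# Hypothetical sketch: convert an image into text elements, then blend that
# text with the written query. The captioning model is a public stand-in for
# Google's internal systems, not part of MUM.
from transformers import pipeline

# Off-the-shelf image-captioning model (assumption: any captioner would do here).
captioner = pipeline("image-to-text", model="nlpconnect/vit-gpt2-image-captioning")

def multimodal_query(image_path: str, text_query: str) -> str:
    """Break the image down into a textual description and merge it with the query."""
    caption = captioner(image_path)[0]["generated_text"]
    return f"{text_query} (image context: {caption})"

# Hypothetical usage: a photo of spaghetti squash plus a written question.
combined = multimodal_query("spaghetti_squash.jpg", "how do I cook this?")
print(combined)  # one unified text query that a text-based system can work with
```

The point isn’t the specific model; it’s that once every element is expressed as text, a single text-based system can reason over all of it.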

What questions can MUM answer? [30:03]

MUM is designed to answer complex questions that are relevant and specific to the searcher.

For a long time, Google has collected a vast warehouse of information specific to you, including where you’re searching from, and they’ve used that information to build user context and inform personalization.

They’re collecting everything in their query logs, and a lot of information goes with those logs: location, what you clicked on, and so on. All of those things inform what they’ll show you in the future. When we talk about MUM, we’re talking about Google running queries through a variety of different models simultaneously, depending on what you’ve given them.

This model is informed and inspired by the human brain. It doesn’t matter which senses you use to pull in information; you’re reconciling that information into one thing, whether that’s words, thoughts, images, etc. 

It’s essentially the same with Google. 

They’re saying, ‘We pull everything in, and then we turn it into words, and then there’s an output that is a series of words as well.’ This is how T5 itself works (before we even think about MUM).

So what can MUM do? 

It can do translation. It can score a sentence for grammar. It can also handle text summarization; this one model encapsulates a series of tasks. You throw in text, and it gives you text back. Another interesting thing they identify with the multi-model concept is mixing models together; when they do that, they notice it produces better responses.
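
As a rough illustration of that text-in, text-out framing, here’s a minimal sketch using the open-source t5-small checkpoint and Hugging Face’s transformers library (an assumption for demo purposes; MUM itself isn’t publicly available). The only thing that changes between tasks is the text prefix on the input:

```python
# Minimal text-to-text sketch using the public t5-small model as a stand-in.
# Different tasks are selected purely by the prefix on the input text.
from transformers import T5Tokenizer, T5ForConditionalGeneration

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

def text_to_text(prompt: str) -> str:
    """Throw in text, get text back, regardless of the task."""
    inputs = tokenizer(prompt, return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Translation, grammar (acceptability) scoring, and summarization with one model:
print(text_to_text("translate English to German: Where is the nearest station?"))
print(text_to_text("cola sentence: The books is on the table."))
print(text_to_text("summarize: Google search is complex, and MUM is Google's "
                   "attempt to answer complex questions with a single model."))
```

The "cola sentence:" prefix is T5’s grammatical-acceptability task, which corresponds to the "score a sentence for grammar" example above.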

In 2017, Google released something called “MultiModel” that combined a series of neural networks across multiple tasks to improve understanding across those tasks.

The bottom line is that MUM uses image recognition, speech recognition, translation, image captioning, and parsing in general to get to the bottom of whatever a searcher is trying to answer with their query.

Want to learn more about the technical documentation for MUM? 

The MUM patent talks about how every input, no matter the source, is basically converted into a text-based query and then run through a series of machine learning models. The combined outputs are used to determine the answer.
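
Here’s a deliberately simplified, hypothetical sketch of that flow; the function names, scores, and results below are placeholders for illustration, not anything taken from the patent or a Google API.

```python
# Hypothetical, highly simplified sketch of the flow described above:
# every input is reduced to a text query, fanned out to several models,
# and the weighted outputs are combined into one answer.
from dataclasses import dataclass
from typing import List


@dataclass
class Candidate:
    answer: str
    score: float


def to_text_query(user_input: str) -> str:
    """Placeholder: image, audio, or text input normalized to a text query."""
    return user_input.strip().lower()


def run_models(query: str) -> List[Candidate]:
    """Placeholder: fan the query out to multiple task-specific models."""
    return [
        Candidate(f"web answer for '{query}'", 0.72),
        Candidate(f"video answer for '{query}'", 0.81),
        Candidate(f"local answer for '{query}'", 0.65),
    ]


def answer(user_input: str) -> str:
    query = to_text_query(user_input)
    candidates = run_models(query)
    best = max(candidates, key=lambda c: c.score)  # combine by taking the top-weighted output
    return best.answer


print(answer("How should I prepare to hike Mt. Fuji in the fall?"))
```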

Bonus: keep your eyes peeled for the impact of another model Google introduced last year, MURAL, which improves the ability to search for images in different languages even when captions exist only in the original language. MURAL will complement MUM with multi-language understanding of images.

Are marketers and SEOs ready for MUM? [36:39]

If you’ve been producing 10x content in various formats for a while, you should be fairly ready for MUM. There’s a rich emphasis on multimedia, and video and images will continue to be effective ways to get your content surfaced.

Smaller businesses can use local tools like Google Business Profile (Google Maps, Q&A, and posts) to appear in SERPs for local MUM results.

MUM is going to work irrespective of what we do. 

Use more structured data, and create more specific content that speaks to the specific context of your users.

But there isn’t a list of things you can do to instantly optimize for MUM; that’s not realistic. It’s really about serving user intent: do we have content that meets the specific needs of the people we’re trying to reach? This may require that you create more robust content, or that you create different pieces of content that cover the same topic for a given audience. We’re going to find that MUM acts as a kind of filter to some degree. Remember, Google wants to answer your specific question rather than just handing out a broad or generic answer.

What can developers do to integrate and align with MUM? [39:46]

Talk to your developers about how you handle media. 

Make sure that you have a good CDN in place and that you’re producing quality content in various formats (text, images, decks, video, audio, and so on). When you’re thinking about the media on your website, make sure it’s highly accessible and easy for people to consume.

These days, news articles often come with a video or audio version as well. They include images, audio, video, and text, and they allow people to access the content in a variety of ways. This is good not just for algorithms and bots but also for users who may have different requirements.

Practical tips you can use to improve SERP performance post-MUM [43:37]

In addition to accessibility, take the simple steps needed to improve your performance. 

  • Describe your images well
  • Surface your media content 
  • Make good use of transcripts 
  • Add structured markup (not just schema); see the sketch after this list
  • Add header tags, and put your tables in actual table markup (instead of divs)
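
As one hedged example of structured markup, here’s a sketch that builds schema.org VideoObject JSON-LD (including a transcript) as a Python dict; all of the values are illustrative placeholders, and VideoObject is just one of many markup types you might use.

```python
# Sketch of structured markup: schema.org VideoObject JSON-LD built as a
# Python dict and serialized for embedding in a page. Values are placeholders.
import json

video_markup = {
    "@context": "https://schema.org",
    "@type": "VideoObject",
    "name": "Google MUM and the Future of Search",
    "description": "Webinar on how Google MUM changes search.",
    "thumbnailUrl": "https://example.com/mum-webinar-thumbnail.jpg",
    "uploadDate": "2022-05-01",
    "transcript": "Full transcript of the webinar goes here...",
}

# Embed the output inside a <script type="application/ld+json"> tag on the page.
print(json.dumps(video_markup, indent=2))
```

Pairing markup like this with an actual on-page transcript gives Google text it can extract and index from your video content.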

In short, make your content easier for Google to extract and index. Use ranking tools to see whether you’re ranking for specific queries or local packs. Analyze your rankings for media (video, images, audio, and so on). If your text results are ranking well but you’re not ranking for video, divert some of your attention to video.

If search queries are oriented around images, ensure your content ranks well for images. Take the time to look at the actual SERPs, as search results are always changing.

Understand that measurement will be challenging. 

For example, Google doesn’t offer analytics on voice search. Take a look at the SERPs and use that, along with other data points, to draw conclusions. Create content that covers all of your bases topically across various formats (text, image, video, audio, mobile, apps, etc.). 

MUM is the future of search. 

Work to cover your bases now; while you can’t optimize for MUM directly, you can optimize your content around user/searcher intent. Work to provide searchers with what they want, where, when, and how they want it. Listen to their requests and cater to their needs as best you can. Do that, and you’ll find the transition is straightforward.

About Michael King, Founder and Managing Director at iPullRank

Michael is the Founder and Managing Director of iPullRank. He has an extensive background in software, web development, and creative. He’s an industry leader, and his work is frequently featured on sites like Moz, SearchEngineWatch, Unbounce, and Distilled. He’s a sought-after speaker who has presented internationally at SMX, SearchLove, SES, MozCon, and LinkLove.

About Erika Varangouli, Head of International Brand at Semrush

Erika drives growth across all channels at Semrush, creating local and global content marketing strategies. She started her career as a journalist, but gradually the world of SEO and content marketing won her over.


About Crystal Carter, Head of SEO Communications at Wix

Crystal is a digital marketing professional with over 10 years of experience working with SEO and marketing clients around the world. As Head of SEO Communications at Wix, she identifies and implements tactics that help optimize digital activity and drive sales, engagement, and growth. Her passion for research and mastering the latest digital tools helps push clients to the forefront of the digital landscape. She has contributed to events and publications from Moz, Whitespark Local Search Summit, Semrush, SMX, Search Engine Land, DeepCrawl, Women in Tech SEO, and more.

Andrew McDermott
