Google MUM has the potential to be a game-changer in search and SEO

26 October 2021

MUM is multilingual (75 languages) and multimodal (both text and images), and has 1,000 times the question-answering power of BERT

The Google Multitask Unified Model (MUM) has the potential to reduce the number of queries a user needs to run to answer one specific question by a factor of 8.

It does that by using a natural language processing (NLP) model that Google research scientists have dubbed T5, the Text-to-Text Transfer Transformer.

The 11-billion-parameter T5 model reframes all NLP tasks into a text-to-text format in which inputs and outputs are always text strings, which makes the model usable for any NLP task including:

  • Machine translation
  • Document summarization
  • Question answering
  • Sentiment analysis
  • Regression analysis
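To illustrate the idea, here is a minimal sketch in plain Python (not Google's implementation) of what "reframing every task as text-to-text" means in practice: each task is reduced to prepending a task-specific prefix to the input string, so a single string-in/string-out model interface can serve every task. The prefixes below follow the conventions T5 uses for its benchmark tasks; the model itself is stubbed out, since the point here is the uniform interface.

```python
# Sketch of T5's text-to-text framing: every NLP task becomes
# "text in, text out" by prepending a task-specific prefix.
# The prefixes mirror T5's benchmark conventions (e.g. SST-2 for
# sentiment, STS-B for regression-style similarity scoring).

TASK_PREFIXES = {
    "translate_en_de": "translate English to German: ",
    "summarize": "summarize: ",
    "question": "question: ",
    "sentiment": "sst2 sentence: ",    # SST-2 sentiment benchmark
    "regression": "stsb sentence1: ",  # STS-B similarity benchmark
}

def to_text_to_text(task: str, text: str) -> str:
    """Reframe any task as a plain text-to-text input string."""
    return TASK_PREFIXES[task] + text

# Every task now flows through the same string interface:
print(to_text_to_text("translate_en_de", "The house is wonderful."))
# translate English to German: The house is wonderful.
print(to_text_to_text("summarize", "MUM is a new Google model..."))
# summarize: MUM is a new Google model...
```

Because inputs and outputs are always strings, adding a new task requires no change to the model's interface, only a new prefix and training data, which is what makes the single 11-billion-parameter model reusable across all of the tasks listed above.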

In finding the best response to a query, MUM uses its AI-powered algorithm to search relevant materials in 75 languages, and returns a result in the language of the original query.

MUM can also accept a combination of text and images as query input - and Google also plans to train the model to search for information in audio and video recordings.

These multilingual and multimodal capabilities are what enable the model to reduce, by a factor of 8, the number of individual queries needed to gather enough information to answer the original question.

Implications for SEO

  • Using keywords in a natural context becomes more important.
  • Understanding user intent and how people translate it into queries will be key.
  • Images - and later audio and video - will become more important content elements.
  • Competition will increase across industries, markets and products as MUM incorporates information from other countries and languages into query responses.
  • Even more than with BERT, effective technical SEO - including structured data markup that helps MUM understand your content - will give you an edge.
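On that last point, structured data markup is typically added as schema.org JSON-LD in the page's HTML. A minimal sketch for an article page might look like the following (the image URL and values here are placeholders, not a real site's markup):

```json
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Google MUM has the potential to be a game-changer in search and SEO",
  "author": {
    "@type": "Person",
    "name": "David Boggs"
  },
  "datePublished": "2021-10-26",
  "image": "https://example.com/placeholder-image.jpg"
}
```

Markup like this gives the crawler explicit, machine-readable signals about what the content is, rather than leaving the model to infer it from the prose alone.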

Want SEO help? Let's talk.



David Boggs
David@DavidHBoggs.com
