Google brings the MUM neural network to Search

Google has announced new search features built on the MUM neural network that will roll out in the near future. Among other changes, the search results page will be redesigned and it will become possible to combine text and image queries.

The MUM (Multitask Unified Model) neural network, presented at Google I/O 2021, can answer complex, multi-part queries and searches for information across 75 languages at once. Below is an overview of the first MUM-based search features, which will become available in the coming months.

  • Multimodal search. Multimodal search will help with queries that are hard to put into words. With this feature, a user looking at an image of a shirt can tap the Lens icon and ask Google to find socks with the same pattern. Another example: if an unfamiliar bicycle part breaks, the user can combine the text query “how to fix it” with a photo of the part (a conceptual sketch of such a combined query follows this list).
  • Redesigned search results page. MUM automatically identifies the topics other users most often explore for a given query and compiles a short overview of them, shown at the top of the results page. For the query “acrylic paintings”, for example, Google will show which tools are needed, which acrylic painting techniques exist, ideas for paintings, and more.
  • Video analysis. Google will automatically recognize topics in video content and offer relevant links with additional information.
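
Google has not published MUM's architecture or a public API, so the mechanics of a combined text-plus-image query are not known in detail. As a rough illustration of the general idea, here is a minimal sketch of multimodal retrieval using the open CLIP model as a stand-in for MUM; the file `part.jpg` and the toy document list are invented for the example.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def text_vec(texts):
    """Embed a list of texts and L2-normalize the vectors."""
    inputs = processor(text=texts, return_tensors="pt", padding=True)
    v = model.get_text_features(**inputs)
    return v / v.norm(dim=-1, keepdim=True)

def image_vec(image):
    """Embed a single PIL image and L2-normalize the vector."""
    inputs = processor(images=image, return_tensors="pt")
    v = model.get_image_features(**inputs)
    return v / v.norm(dim=-1, keepdim=True)

# "Broken bicycle part" scenario: combine a photo of the part with the
# text "how to fix it" into one query vector (here, a simple average of
# the two embeddings -- an assumption, not MUM's actual method).
query = text_vec(["how to fix it"]) + image_vec(Image.open("part.jpg"))
query = query / query.norm(dim=-1, keepdim=True)

# Rank a toy corpus of candidate documents by cosine similarity.
docs = [
    "how to replace a bicycle derailleur hanger",
    "acrylic painting techniques for beginners",
]
scores = (text_vec(docs) @ query.T).squeeze(-1)
print(docs[scores.argmax()])
```

Averaging the two normalized embeddings is just one simple way to fuse modalities; MUM presumably processes text and image jointly inside a single model rather than merging separate vectors after the fact.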