Google Search’s new features: from Multisearch to more visual results to shopping
Google Search is changing, with richer results that will include more visuals and videos, along with an expansion of the Multisearch feature. For users, the main question is how these changes will impact them. Here’s a quick look at what will change in the coming months when you search for anything on Google.
Multisearch feature on Google Lens, Multisearch Near Me
This was launched as a beta back in April and Google is now bringing it to English globally. It will be coming to 70 more languages in the coming months. The Google multisearch feature lets users rely on their phone’s camera to find an image using Google Lens and then add a text-based query on top of it. Google’s earlier blog post had revealed some examples of how the feature works. For example, you could use a screenshot of an orange-coloured dress and add ‘green’ to find it in another colour. Or you could take a photo of a particular plant and add “care instructions” as another query.
Google’s Multisearch feature and the ‘near me’ option in use. (Image credit: Google)
Multisearch will get another addition called ‘near me’, though this particular aspect will only roll out later this fall in the US in English. It will allow users to use multisearch to find items near them with the help of Google Lens. For example, one could take a picture of a particular food item and add a ‘near me’ query to find out if and where the dish is available nearby. Google notes in the blog post that this “new way of searching will help you find and connect with local businesses, whether you’re looking to support your neighborhood shop, or just need something right now.”
Get ready for more visual results
This appears to be Google’s attempt to take on some of the challenges posed by TikTok and Instagram, with more and more Gen Z users relying on these platforms for search. Google now intends to get more visual with its mobile search results. For example, when you search for topics such as cities or a particular landmark, you might end up with results that have more visuals, including Web Stories, instead of plain text-based links.
Google says it will make it easier to “explore a subject highlighting the most relevant and helpful information, including content from creators on the open web.” So yes, an influencer’s travel video might also show up in the search results. For topics such as cities, users may see “visual stories and short videos from people who have visited, tips on how to explore the city, things to do, how to get there and other important aspects…” notes the blog post.
Google will suggest questions even as you type them
Say you are just starting to type out a question in the Google Search box. In the coming months, Google will try to expand on your query, showing other words that you can tap to add to it. Google will use its predictive powers to provide “relevant content straight away, before you’ve even finished typing.”
It will “provide keyword or topic options to help you craft your question,” notes the blog. For example, when planning a vacation in a country, Google could suggest options like ‘best cities for families in XYZ’ or even ‘best cities for couples on honeymoon’.
“Using deep understanding of how people search, we’ll soon show you topics to help you go deeper or find a new direction on a subject. And you can add or remove topics when you want to zoom in and out,” explains the blog.
More importantly, the way the results are displayed will change, with a richer mix of formats on show, from text to images to video. These changes will start appearing in the coming months.
More topics as you scroll through results
Google also notes that it will show more related topics around a search query as users scroll through the results on their mobile. Say you search for a particular city in a country; as you scroll down, you could see topics such as the best restaurants to eat at in that city, or details about the city’s beaches and other public places. Google says it will show content beyond the original query. Again, this is something launching in English in the US in the coming months.
The updated digital menus on Google Search.
Better results for food, dishes
The visual-heavy results are also being extended to food queries. If you search for a particular dish and the restaurants that serve it, the results will be more visually appealing. Google will showcase nearby restaurants along with an image of the dish from their menu, all designed to look more appetising.
The blog post also notes that Google is expanding “coverage of digital menus, and making them more visually rich and reliable.” It will combine “menu information provided by people and merchants, and found on restaurant websites that use open standards for data sharing” to power these results. Google says the new food menus will showcase the most popular dishes at a restaurant, and also offer filters for dietary options such as vegetarian, vegan or even spicy.
The multisearch feature can also be used to identify new food items and where these are being sold.
Shopping will get more personal
Google also announced a host of shopping-related features, though these are coming to the US first. These include the ability to shop for sneakers in 3D, see other trending clothing or fashion-related items (often displayed in a story style) and shop a complete fashion look. When someone searches with the word “shop” followed by whatever item they are looking for, Google will show a “visual feed of products, research tools and nearby inventory related to that product.” It is expanding the search shopping experience to all categories, from electronics to beauty, and will add more regions on mobile. The experience is also coming soon to desktop.
Google is also adding more dynamic filters for users when shopping via the search engine. These adapt based on real-time Search trends. In the blog post, Google explains: “if you’re shopping for jeans, you might see filters for ‘wide leg’ and ‘bootcut’ because those are the popular denim styles right now — but those may change over time, depending on what’s trending.” The feature will be available in the US, Japan and India, and will come to more regions soon.
Finally, the Discover tab on Google will also showcase more shopping-related styles and results to inspire users.
Lens AR translation
Google says the new Lens translation update will ensure that when you point the Lens camera at a poster in another language, you will see translated text realistically overlaid onto the image underneath. Once again, it is relying on “major advancements in machine learning” to create this blended look. “We’re now able to blend translated text into complex images, so it looks and feels much more natural, instead of jarring,” the blog post notes.
Google says it is using “generative adversarial networks (also known as GAN models), which is what helps power the technology behind Magic Eraser on Pixel,” to achieve this effect. The experience is launching later this year.
Discussions and Forums
Search results will show a dedicated “discussions and forums” section for US English users. This means threads from Reddit, Quora and other platforms could also show up in the results.