
Google Search gets slew of new features, more focus on visuals

Google Search is getting a slew of new features, the company announced at its ‘Search On’ event, and many of these will ensure richer, more visually focused results. “We’re going far beyond the search box to create search experiences that work more like our minds – that are as multi-dimensional as people. As we enter this new era of search, you’ll be able to find exactly what you’re looking for by combining images, sounds, text and speech. We call this making Search more natural and intuitive,” Prabhakar Raghavan, Google’s SVP of Search, said during the keynote.
First, Google is expanding the multisearch feature – which it introduced in beta in April this year – to English globally, and it will come to 70 more languages over the next few months. The multisearch feature lets users search for multiple things at the same time, combining both images and text. The feature can be used along with Google Lens as well. According to Google, users rely on its Lens feature nearly eight billion times a month to search for what they see.
By combining Lens with multisearch, users will be able to take a picture of an item and then use the phrase ‘near me’ to find it nearby. Google says this “new way of searching will help users find and connect with local businesses.” “Multisearch near me” will start rolling out in English in the US later this fall.
Google Search shopping when used with the multisearch feature. (Image: Google)
“This is made possible by an in-depth understanding of local places and product inventory, informed by the millions of images and reviews on the web,” Raghavan said of multisearch and Lens.
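Google has not said how multisearch works under the hood, but the general pattern of embedding an image and a text query into one shared space, then ranking nearby inventory against the fused query, can be sketched with an open-source CLIP model. Everything below (the toy catalog, the fusion-by-averaging, the distance filter) is an illustrative assumption, not Google’s implementation.

```python
# Illustrative sketch only: a CLIP-style multimodal "near me" lookup.
# This is NOT Google's pipeline; the catalog, the embedding fusion and
# the radius filter are assumptions for demonstration.
from math import radians, sin, cos, asin, sqrt

from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # open CLIP model, shared image/text space

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

# Toy local inventory: each entry has a text description and a location.
catalog = [
    {"name": "Casa Mole", "desc": "restaurant serving mole negro", "lat": 19.43, "lon": -99.13},
    {"name": "Taco Norte", "desc": "taqueria with al pastor tacos", "lat": 19.44, "lon": -99.14},
]

def multisearch_near_me(image_path, query_text, user_lat, user_lon, radius_km=5):
    # Fuse the photo and the text query by averaging embeddings (an assumption).
    img_emb = model.encode(Image.open(image_path))
    txt_emb = model.encode(query_text)
    query_emb = (img_emb + txt_emb) / 2

    nearby = [c for c in catalog
              if haversine_km(user_lat, user_lon, c["lat"], c["lon"]) <= radius_km]
    # Rank nearby places by similarity of their descriptions to the fused query.
    scored = [(float(util.cos_sim(query_emb, model.encode(c["desc"]))), c) for c in nearby]
    return sorted(scored, key=lambda s: s[0], reverse=True)

# e.g. multisearch_near_me("dish_photo.jpg", "mole negro near me", 19.43, -99.13)
```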
Google is also improving how translations appear over an image. According to the company, people use Google to translate text on images over 1 billion times per month, across more than 100 languages. With the new feature, Google will be able to “blend translated text into complex images, so it looks and feels much more natural.” The translated text will thus look seamless, like part of the original image, rather than standing out from it. According to Google, it is using “generative adversarial networks (also known as GAN models), which is what helps power the technology behind Magic Eraser on Pixel,” to deliver this experience. This feature will roll out later this year.
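Google describes this pipeline only at a high level: detect the text, translate it, erase the original with GAN-based inpainting (the Magic Eraser family of techniques) and re-render the translation in a matching style. A rough sketch of how those stages might fit together follows; every helper in it (detect_text, translate, gan_inpaint, render_text) is hypothetical and named only to show the order of operations.

```python
# Hypothetical pipeline sketch: how a GAN-based "translate and blend" flow
# might be staged. None of these helpers are real Google APIs; they only
# illustrate the steps the announcement describes.
from dataclasses import dataclass
from PIL import Image

@dataclass
class TextRegion:
    bbox: tuple[int, int, int, int]  # left, top, right, bottom
    text: str
    font_size: int
    color: tuple[int, int, int]

def detect_text(image: Image.Image) -> list[TextRegion]:
    """Hypothetical OCR step: find text regions and their visual style."""
    raise NotImplementedError

def translate(text: str, target_lang: str) -> str:
    """Hypothetical translation step."""
    raise NotImplementedError

def gan_inpaint(image: Image.Image, bbox) -> Image.Image:
    """Hypothetical GAN inpainting: reconstruct the background under the
    original text, the role Magic Eraser-style models play here."""
    raise NotImplementedError

def render_text(image: Image.Image, region: TextRegion, new_text: str) -> Image.Image:
    """Hypothetical renderer: draw translated text matching the original style."""
    raise NotImplementedError

def translate_image(image: Image.Image, target_lang: str) -> Image.Image:
    for region in detect_text(image):
        image = gan_inpaint(image, region.bbox)            # erase original text
        translated = translate(region.text, target_lang)   # translate it
        image = render_text(image, region, translated)     # blend it back in
    return image
```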
It is also making improvements to its iOS app, where users will see shortcuts right under the search bar. These will help users shop using their screenshots, translate any text with their camera, find a song and more.
Food results in the revamped Google Search.
Google Search’s results will also get more visually rich when users browse for information about a place or topic. In the example Google showed, a search for a city in Mexico also displayed videos, images and other information about the place, all in the first set of results. Google says this will ensure a user does not have to open multiple tabs to learn more about a place or topic.
Google will also provide more relevant information even as a user begins to type in a question, offering “keyword or topic options to help” users craft their questions. It will also showcase content from creators on the open web for some of these topics, such as cities, along with travel tips. The “most relevant content, from a variety of sources, no matter what format the information comes in — whether that’s text, images or video,” will be shown, notes the company’s blog post. The new feature will roll out in the coming months.
When it comes to searching for food – whether a particular dish or an item at a restaurant – Google will show visually richer results, including photos of the dish in question. It is also expanding “coverage of digital menus, and making them more visually rich and reliable.”
According to the company, it is combining “menu information provided by people and merchants, and found on restaurant websites that use open standards for data sharing,” and relying on its “image and language understanding technologies, including the Multitask Unified Model,” to power these new results.
“These menus will showcase the most popular dishes and helpfully call out different dietary options, starting with vegetarian and vegan,” Google said in a blog post.
It will also tweak how shopping results appear on Search, making them more visual along with accompanying links, and letting users shop for a ‘complete look’. Search results will also support 3D shopping for sneakers, with users able to view these items in a 3D view.
Google Maps
Google Maps is also getting some new features that will surface more visual information, though most of these will be limited to select cities. For one, users will be able to check the ‘Neighbourhood vibe’, meaning they can figure out the places to eat, the places to visit and so on in a particular locality.
This will appeal to tourists, who will be able to use the information to get to know a district better. Google says it is using “AI with local knowledge from Google Maps users” to surface this information. Neighbourhood vibe starts rolling out globally in the coming months on Android and iOS.
It is also expanding the immersive view feature to let users see 250 photorealistic aerial views of global landmarks, spanning everything from the Tokyo Tower to the Acropolis. According to Google’s blog post, it is using “predictive modelling”, which is how immersive view automatically learns historical trends for a place. Immersive view will roll out in the coming months in Los Angeles, London, New York, San Francisco and Tokyo on Android and iOS.
Users will also be able to see helpful information with the Live View feature. Search with Live View helps users find a place around them, say a market or a store, while they are walking around. Search with Live View will be made available in London, Los Angeles, New York, San Francisco, Paris and Tokyo in the coming months on Android and iOS.
It is also expanding its eco-friendly routing feature – which launched earlier in the US, Canada and Europe – to third-party developers through the Google Maps Platform. Google is hoping that companies in other industries, such as delivery or ridesharing services, will have the option to enable eco-friendly routing in their apps and measure fuel consumption.
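The announcement did not detail what the developer preview looks like. As a rough illustration, a request for a fuel-efficient reference route through the Google Maps Platform Routes API might look like the sketch below; treat the exact fields and response shape as assumptions, since the preview’s interface was not specified.

```python
# Sketch of an eco-friendly routing request against the Google Maps Platform
# Routes API (computeRoutes). Field names follow the public Routes API, but
# treat them as indicative, since the developer preview's exact shape wasn't
# part of the announcement.
import requests

API_KEY = "YOUR_API_KEY"  # placeholder

body = {
    "origin": {"address": "San Francisco, CA"},
    "destination": {"address": "Los Angeles, CA"},
    "travelMode": "DRIVE",
    "routingPreference": "TRAFFIC_AWARE_OPTIMAL",
    # Ask for a fuel-efficient alternative alongside the default route.
    "requestedReferenceRoutes": ["FUEL_EFFICIENT"],
    # Engine type influences the fuel/energy model used.
    "routeModifiers": {"vehicleInfo": {"emissionType": "GASOLINE"}},
}

resp = requests.post(
    "https://routes.googleapis.com/directions/v2:computeRoutes",
    json=body,
    headers={
        "X-Goog-Api-Key": API_KEY,
        # The field mask limits the response to the fields we need,
        # including the estimated fuel consumption the article mentions.
        "X-Goog-FieldMask": (
            "routes.routeLabels,routes.duration,routes.distanceMeters,"
            "routes.travelAdvisory.fuelConsumptionMicroliters"
        ),
    },
)
resp.raise_for_status()
for route in resp.json().get("routes", []):
    print(route.get("routeLabels"), route.get("distanceMeters"),
          route.get("travelAdvisory", {}).get("fuelConsumptionMicroliters"))
```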
