How Google is Making Search More Multidimensional
Abby is PMG’s senior managing editor, where she leads the company’s editorial program and manages the PMG Blog and Insights Hub. As a writer, editor, and marketing communications strategist with nearly a decade of experience, Abby's work in showcasing PMG’s unique expertise through POVs, research reports, and thought leadership regularly informs business strategy and media investments for some of the most iconic brands in the world. Named among the AAF Dallas 32 Under 32, her expertise in advertising, media strategy, and consumer trends has been featured in Ad Age, Business Insider, and Digiday.
Google recently hosted its annual ‘Search On’ event, with executives and product managers taking the stage to share the latest updates coming soon to Google Maps, Search, and Shopping. Advancements in multisearch, personalization capabilities, and 3D shopping experiences are just a few of the product updates Google is making to transform search from the traditional text box into a more helpful, intuitive experience.
Google is on a mission to improve the lives of as many people as possible, and with that, Google Search is evolving to become as multidimensional as people are. Combining images, speech, text, and sounds, Google Search is entering a new era—one where online search is more comprehensive and intuitive than ever. With a deep understanding of how people search, Google is using computer vision, predictive modeling, real-time data, and machine learning to build and power a more immersive search experience for users.
Updates to the Google Search experience include:
Multisearch, which was released earlier this year in the U.S. and allows users to search with images and text at the same time, is rolling out to over 70 languages in the next few months, helping users around the world explore topics more naturally by combining images, text, and sound. “Multisearch near me” will roll out in English markets later this fall, helping users find things instantly by connecting them with local businesses nearby.
Advancements in machine learning will allow Google Lens to translate text across different languages faster than ever, breaking down language barriers around the world.
Search bar shortcuts were introduced to the Google app for iOS to enable greater personalization at scale.
Improvements to predictive search, coupled with a new, more visual layout and search experience, will help users explore topics on Google more naturally by combining articles, images, videos, and other relevant information all in one place.
At the event, Google touted that updates to the search experience were designed to better reflect how people search for information, find inspiration, and shop online, helping users find what they’re looking for, dig deeper into a topic, or discover a new direction, all from a singular Google search. “A product like multisearch gives search queries more context than just words or an image by empowering the user to shape the results in a more nuanced, helpful way,” said Jason Hartley, head of search, shopping, and social at PMG. “As is frequently the case, what’s truly exciting about these features is what’s happening underneath the technology. In this case, it’s clear that Google is making huge advancements in artificial intelligence and machine learning, and we can expect to see even more products move us further away from the ‘type and click’ model and into a truly immersive experience and more human approach to searching for information.”
Google Maps is being reimagined to “look and feel more like the real world,” starting with a more visual and intuitive map that evolves from its current 2D form into a rich, multidimensional experience. In the months to come, Google will release a neighborhood vibe feature, updates to Live View, and more, all thanks to advancements in computer vision and predictive modeling.
Updates to Google Maps include:
Find out what’s worth checking out in the neighborhood: Combining artificial intelligence with local knowledge shared by Google Maps users in photos and reviews, “neighborhood vibe check” will surface local gems, highlight what’s new, and point users toward destinations worth exploring in any given neighborhood.
Live View breakthroughs: Search with Live View is coming soon, letting users navigate the world more easily with overlaid arrows, directions, and markers for popular destinations like coffee shops, stores, and transit stations.
New immersive aerial views: Using predictive modeling, Google Maps is introducing aerial views for 250 global landmarks and providing a more “multi-dimensional view of an area with critical information like the weather, traffic, and busyness layered on top.” By leaning on historical trends, Google Maps will soon power a more immersive view of what a place will look like “tomorrow, next week, or even next month.”
These features will begin rolling out globally on Google Maps on Android and iOS in the coming months.
At Google’s Search On event, several product developments and new features were revealed specifically for Google Shopping to help power a more “immersive, informed, and personalized” shopping experience. A recurring theme across announcements was how the Shopping Graph, Google’s AI-enhanced model that’s informed by more than 35 billion product listings, was used to inform design decisions in addition to powering the layout of multimedia results that will be pulled in from across Search, Shopping, and Maps.
One of the most innovative features is “Shop the Look,” a new tool that will allow shoppers to seamlessly assemble an outfit—with products sourced from across retailers—directly on Google. Shop the Look supplements Google’s new “shop” feature: by simply adding “shop” to the start of a search query, shoppers retrieve a visual feed of products, nearby inventory, and inspiration pulled from popular websites, offering a more immersive look at the products, content, and nearby locations that match what they’re shopping for. The shoppable search experience has also expanded beyond apparel to all categories, including electronics and beauty. With a shoppable display of products from a wide range of retailers and brands, plus personalization features and product filters, Google Shopping now feels more like digital window shopping, emulating the experience of browsing a brick-and-mortar store.
“Much of what we saw at the Google Search On event is in response to larger consumer trends we see taking shape across the industry as Google evolves the search experience to stay relevant,” said Hartley. “I’m excited to see that Google is thinking not only of cosmetic changes to the SERPs, though design is critically important, but also bringing the world outside of Google into the search experience with features like multisearch and Shop the Look.”
Updates to make Google more intuitive, imaginative, and immersive come as the company faces increasing competition from platforms like Amazon, Reddit, and TikTok. Insider Intelligence suggests that as much as 60 percent of shoppers first visit Amazon when looking for a product, while long-tail queries that include “Reddit” on Google are only growing as people bypass Google search results and seek crowd-sourced answers and information on other platforms instead. Last month, The New York Times reported that TikTok has become the new go-to search engine for Gen Z consumers, as more young people turn to TikTok, rather than Google or other platforms, to search for ideas, inspiration, and information. At a conference earlier this year, Google senior vice president Prabhakar Raghavan outlined that in internal Google studies, almost 40 percent of young people go to Instagram or TikTok when they’re looking for a place for lunch, rather than Google.
While the popularity of other platforms is growing, Google maintains its dominance as the leading search engine, with these product updates helping to defend its market position while evolving the platform to meet consumers’ ever-changing needs.