On Thursday, Google announced new search-related updates. In addition to the broad rollout of the redesigned AI Overviews feature, which makes links to publications more prominent, along with — whomp, whomp — ads on AI Overviews, Google touted some entirely new features.
Here’s what’s happening with Lens, Circle to Search, and Google’s new experience for searching for recipes.
1. Video and voice search with Lens
The Lens tool, handy for searching visual elements, can now process video and voice. In the Google app, point the camera at something best understood as a moving image (like a school of fish swimming in a circle, as Google showed in its demo), hold down the shutter to record a video, and ask Lens a question, like “why are they swimming together?” Lens will process the video and your question, then give you a response in the form of an AI Overview, plus relevant resources to click on.
Lens for video is currently only available for those with access to Google’s testing ground, Search Labs. Lens for photos with voice prompts is live for Android and iOS users through the Google app.
Lens now lets users ask questions about videos with voice prompts.
Credit: Google
2. Circle to Search for music
Circle to Search, another visual search tool, debuted earlier this year. Android users can circle or scribble over an object on the screen to look it up without having to switch apps. Now they can also search for a song playing in a video on social media or a streaming app in the same manner. Obviously, you can’t physically circle a song, so instead you tap the musical note icon next to the search bar at the bottom of the page. Google’s AI model will then process the audio and pull up information about the song. Circle to Search for music is available to Android users with compatible Google Pixel and Samsung Galaxy devices.
3. AI-organized search results
Google’s search results are going to look a little different. Starting with recipes and meal inspiration, open-ended queries will return results organized by Google’s AI models. Rolling out this week in the U.S. on mobile, the feature breaks the full page of results down into subcategories of the initial query. It combines Google’s Gemini AI models with its core Search algorithm to surface relevant topics based on a request for, say, vegetarian appetizer ideas. AI-organized search results are available in the mobile Google app.
Google will now use its AI models to surface top results related to your search.
Credit: Google