Google just made online shopping feel more like chatting with your most stylish friend. The company's latest AI Mode update, powered by Gemini 2.5's multimodal capabilities, turns visual search from a clunky keyword exercise into natural conversation: shoppers describe what they want in plain English and get back results that actually match their vibe. The rollout starts this week for US users, marking a significant shift in how people discover products online.
"Starting today, you can ask a question conversationally and get a range of visual results in AI Mode, with the ability to continuously refine your search in the way that's most natural for you," Google announced in its blog post. The company promises users will "see rich visuals that match the vibe you're looking for, and can follow up in whatever way is most natural for you, like asking for more options with dark tones and bold prints."
The conversational approach is far more intuitive than traditional e-commerce filtering. Instead of checking boxes for "blue," "size medium," and "cotton blend," shoppers can simply ask for "barrel jeans that aren't too baggy" and then naturally refine with "I want more ankle length" or "show me acid-washed denim." Google's AI will "intelligently provide a relevant set of shoppable options," making it easy to jump directly to retailer sites for purchases.
This update builds heavily on Google's existing visual search infrastructure, combining Google Search with Lens and Image search capabilities. But the real breakthrough comes from integrating Gemini 2.5's advanced multimodal and language capabilities, which allows AI Mode to recognize subtle details and secondary objects in images. That means the system can understand visual context in ways previous search tools couldn't.
The shopping angle represents just one piece of Google's broader visual search strategy. Users can also upload reference images or snap photos to find visually similar results, mixing images with natural language descriptions to narrow down exactly what they're after. The conversational search capabilities extend beyond retail too, working for general visual exploration like hunting down interior design inspiration or identifying architectural styles.