Amazon just rolled out Lens Live, an AI-powered visual search upgrade that identifies products through your camera and matches them against billions of items in real time. The launch integrates Rufus, Amazon's shopping assistant, directly into the camera view, marking the company's boldest push yet into computer vision commerce. Available to tens of millions of iOS users starting today, the rollout signals Amazon's intention to own the visual discovery moment that increasingly drives modern shopping behavior.
The move is a visual search bombshell that could reshape how millions of people discover products. Lens Live transforms the company's existing camera-based shopping tool into a real-time AI assistant that recognizes products and surfaces matches from Amazon's catalog the moment you point your phone at an item.
The timing couldn't be more strategic. As social commerce explodes and visual discovery becomes the dominant shopping trigger, Amazon is betting big on owning that crucial moment when customers spot something they want. "When you spot an item you love on social media or while out and about, Amazon Lens is the quickest way to find similar items," according to Amazon's announcement.
What sets Lens Live apart is speed and intelligence. The moment customers open the camera, products start appearing in a swipeable carousel at the bottom of the screen. No more snapping a photo and waiting for results: the AI processes what you're seeing in real time, matching it against billions of Amazon products using what the company describes as "deep learning visual embedding models."
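Amazon hasn't published the models behind Lens Live, but embedding-based visual matching generally works the same way everywhere: encode the camera frame into a vector, then rank catalog items by similarity to that vector. The sketch below is illustrative only; the random projection stands in for a trained visual encoder, and the toy catalog stands in for Amazon's index of billions of items.

```python
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 128
CATALOG_SIZE = 10_000  # toy stand-in for a catalog of billions of items

# Pretend catalog embeddings, L2-normalized so a dot product equals
# cosine similarity.
catalog = rng.standard_normal((CATALOG_SIZE, EMBED_DIM))
catalog /= np.linalg.norm(catalog, axis=1, keepdims=True)

# Stand-in "encoder": a fixed random projection from raw pixels to the
# embedding space. A production system would use a trained vision model.
projection = rng.standard_normal((EMBED_DIM, 64 * 64 * 3))

def embed_frame(frame: np.ndarray) -> np.ndarray:
    """Map a 64x64 RGB frame to a unit-length embedding vector."""
    vec = projection @ frame.reshape(-1)
    return vec / np.linalg.norm(vec)

def top_matches(frame: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k most similar catalog items for one frame."""
    query = embed_frame(frame)
    scores = catalog @ query              # cosine similarity per item
    return np.argsort(scores)[::-1][:k]   # highest similarity first

frame = rng.random((64, 64, 3))  # a fake camera frame
print(top_matches(frame))
```

In a live camera feed, this loop would run continuously on successive frames, which is why the carousel can refresh as you pan the phone rather than after a deliberate snapshot.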
But here's where it gets interesting: Amazon embedded Rufus, its conversational shopping assistant, directly into the camera experience. Users now see suggested questions and product summaries without leaving the viewfinder, helping turn casual browsing into informed purchase decisions. "These conversational prompts and summaries appear under the product carousel, allowing customers to perform speedy research," Amazon explains in technical documentation.
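Amazon hasn't published the data model behind these prompts. Purely as illustration, each carousel entry might pair a catalog match with a Rufus summary and suggested questions, along the lines of this hypothetical sketch (all field names and values are invented):

```python
from dataclasses import dataclass, field

@dataclass
class CarouselItem:
    # Hypothetical shape of one entry in the Lens Live carousel.
    asin: str                  # catalog identifier (placeholder value below)
    title: str
    price: str
    rufus_summary: str         # short AI-generated product summary
    suggested_questions: list[str] = field(default_factory=list)

item = CarouselItem(
    asin="B0EXAMPLE",
    title="Stainless travel mug, 16 oz",
    price="$24.99",
    rufus_summary="Keeps drinks hot for hours; fits most cup holders.",
    suggested_questions=[
        "Is it dishwasher safe?",
        "How does it compare to similar mugs?",
    ],
)
print(item.suggested_questions[0])
```

Whatever the real schema looks like, the key point is that the prompts and summaries ride along with each match, so the assistant can answer questions about exactly the item in the viewfinder.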
The technical architecture reveals Amazon's serious AI infrastructure investments. Lens Live runs on AWS-managed services, including Amazon OpenSearch, with lightweight computer vision models running directly on users' devices. This hybrid approach, with on-device processing for responsiveness and cloud processing for matching, suggests Amazon learned from earlier visual search efforts that frustrated users with slow response times.
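Amazon confirms only that Lens Live uses AWS-managed services including Amazon OpenSearch; the split below is a plausible reading, not a documented design. Assuming the on-device model emits an embedding vector, the cloud side could resolve matches with an OpenSearch k-NN query, roughly like this (the host, index name, and field name are invented):

```python
from opensearchpy import OpenSearch

# Hypothetical endpoint for the catalog-embedding cluster.
client = OpenSearch(hosts=[{"host": "search.example.internal", "port": 9200}])

def match_products(embedding: list[float], k: int = 10) -> list[dict]:
    """Run an approximate nearest-neighbor query against catalog embeddings."""
    body = {
        "size": k,
        "query": {
            "knn": {
                "image_embedding": {   # hypothetical vector field name
                    "vector": embedding,
                    "k": k,
                }
            }
        },
    }
    response = client.search(index="product-embeddings", body=body)
    return [hit["_source"] for hit in response["hits"]["hits"]]
```

The division of labor makes sense either way: a small on-device model can keep up with the camera at interactive frame rates, while the heavy similarity search over the full catalog stays in the cloud.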