The wearable tech industry is having an identity crisis. At Google's Project Aura demo last week, The Verge's Victoria Song asked a simple question: what should we call these new glasses-shaped computers? Suddenly everyone had opinions, and the debate revealed a deeper confusion across Silicon Valley about how to position AI-powered face computers.
The answer? Nobody really knows.
Meta fired the first shot earlier this year when a communications representative asked journalists to refer to Ray-Ban Meta glasses as 'AI glasses' instead of smart glasses. The strategic rebrand makes sense - it distances these devices from Google Glass's failed legacy while positioning artificial intelligence, not augmented reality, as the killer feature. CEO Mark Zuckerberg and CTO Andrew Bosworth have consistently framed the glasses as the perfect AI delivery vehicle.
But Google is playing by different rules. Juston Payne, Google's director of product management for XR, defines AI glasses as stylish, lightweight devices that may or may not have displays, with AI integral to the experience. Project Aura doesn't qualify under this definition - Google officially calls it 'wired XR glasses' because of its tethered battery pack.
The confusion deepens when you talk to hardware partners. Xreal CEO Chi Xu, whose company collaborated with Google on Project Aura, simply laughed when asked about categorization. 'We will call all our glasses and previous products AR glasses,' he told Song.
This isn't just semantic confusion - it reflects a fundamental industry shift. Research firms can't even agree on basic definitions. Gartner defines smart glasses as camera- and display-free devices with Bluetooth and AI - essentially glorified headphones. Counterpoint Research focuses on 'smart glasses without see-through displays' as the primary market driver. IDC takes the broadest approach, including anything glasses-shaped.
The old categories are breaking down. We used to have clear divisions between virtual reality (immersive, cut off from the world) and augmented reality (digital overlays on reality). Then mixed reality and extended reality entered the conversation, blurring lines further. Form factor used to predict function - headsets meant VR, glasses meant AR. Not anymore.
Today's headsets increasingly blend virtual and real worlds, while glasses-shaped devices serve vastly different purposes. Samsung's Galaxy XR isn't called the Galaxy MR, even though mixed reality better describes its capabilities. True AR, with its sci-fi promise of digital overlays, remains mostly theoretical.