Meta's first consumer AR glasses with built-in displays are hitting shelves September 30 for $799, marking a crucial step toward Mark Zuckerberg's vision of glasses replacing smartphones. The real breakthrough isn't the small translucent screen - it's the neural wristband that reads electrical signals from your body to control the device through hand gestures.
Meta just crossed the AR threshold that matters most: getting displays into consumers' hands at scale. The company's $799 Ray-Ban Display glasses go on sale September 30, and CNBC's early hands-on shows why this device is a bigger step toward mainstream adoption than Meta's flashy Orion prototypes from last year's Connect event.
The real story isn't the small translucent screen tucked into the right lens; it's the fuzzy gray wristband that comes with it. The band's electromyography (EMG) sensors read the electrical signals generated by the muscles in your wrist, letting you control the glasses through hand gestures. CNBC's Jonathan Vanian described feeling a small electric jolt when the band activated, "not as much of a shock as taking clothes out of the dryer, but noticeable."
That neural interface matters because the display itself is deliberately simple. Unlike Orion's complex 3D overlays, which required a separate computing puck, these glasses prioritize practicality. The translucent screen handles basic functions: reading messages, previewing photos, showing live captions during conversations, and controlling Spotify. "It's more utility than entertainment," Vanian noted after testing the device at Meta's Menlo Park headquarters.
The gesture learning curve shows how early we are in this transition. Vanian initially struggled with the pinching motion that opens the camera app, finding himself double-pinching as if double-clicking a computer mouse. "I learned I have subpar pinching skills that lack the correct cadence and timing," he admitted. The experience felt surreal: continuously pinching his fingers while people watched, reminiscent of the "Kids in the Hall" sketch about crushing heads.
But when the controls clicked, they felt surprisingly natural. Adjusting the volume by rotating thumb and forefinger, as if turning an invisible stereo knob, delivered what Vanian called an "unexpectedly delightful experience." This suggests Meta's neural interface could eventually feel as intuitive as touchscreens do today.
The display's positioning creates its own cognitive challenges. Sitting just outside the center of the field of view, the translucent screen caused "some cognitive dissonance" as Vanian's eyes constantly switched focus between digital and physical elements. Icons appeared "a bit murky" when set against real-world backgrounds, though the high-resolution screen handled live captions effectively in noisy environments.