Apple's Shortcuts app promises mind-reading automation, but the gap between promise and reality remains wide, according to a new deep dive from The Vergecast that also covers Meta's latest smart glasses experiment.
The promise sounds incredible: your iPhone anticipating your needs, automating daily routines, becoming an extension of your thoughts. Walk out of the office and your spouse automatically gets a text with your ETA. Start reading and notifications vanish. Crave coffee and your phone picks a neighborhood favorite at random, even placing the order. That's the tantalizing vision of Apple Shortcuts, but the reality proves far more complicated.
On the latest episode of The Vergecast, podcaster and automation expert Stephen Robles joins to dissect what's working and what's broken in Apple's ambitious automation platform. The conversation reveals a fundamental tension: Shortcuts offers enormous power but remains too complex for most users to harness effectively.
Robles, who has built a career partly around making Shortcuts accessible, brings real-world examples of automations that actually work. He shares his favorite Shortcuts, the creative process behind building new automations, and how AI integration could make the platform more compelling. Meanwhile, Vergecast host David Pierce voices the skepticism many users feel about Shortcuts' intimidating interface and steep learning curve.
The timing of this Shortcuts deep dive feels particularly relevant as Apple continues pushing automation across its ecosystem. Shortcuts is often cited as one of the most powerful yet underused features on the iPhone. The app's potential to change how people interact with their devices is clear, but so is the complexity keeping it from mainstream adoption.
Before diving into automation, the episode tackles another piece of futuristic tech facing its own reality check. The Verge's Victoria Song shares her extensive hands-on experience with Meta's Ray-Ban Display smart glasses, the ones with an actual screen embedded in the lens.
Song wore the glasses daily for weeks, testing them in real-world scenarios most reviewers skip. Her findings paint a nuanced picture of impressive hardware wrestling with an unclear purpose. The display technology works and the build quality feels solid, but the fundamental question remains unanswered: what exactly do we want smart glasses to do that our phones can't?



