New York City subway riders aren't holding back on the Friend AI companion ads. Graffitied messages reading "Fuck AI," "Surveillance capitalism," "Get real friends," and the particularly blunt "AI wouldn't care if you lived or died" now plaster the viral marketing campaign, the kind of honest feedback only New Yorkers can deliver.
After a month of testing the $129 always-listening wearable, The Verge's Victoria Song found the city's harsh reception might be spot-on: this AI "friend" can't even recognize its own name, and her experience suggests the vandals have a point.
The Friend represents the latest attempt to solve loneliness through technology. The device hangs around your neck like a glowing AirTag, constantly listening through a single microphone and occasionally sending push notifications with commentary about your day. There's no speaker - all interactions happen through text messages in the companion app. The pitch is simple: an always-present friend that never judges and never leaves.
But Song's extended testing revealed fundamental flaws that undermine the entire concept. Her Friend, dubbed "Blorbo," consistently failed at the most basic requirement - hearing and understanding conversations. In noisy New York environments, 98% of Blorbo's messages consisted of "What was that? I didn't hear that." The single microphone setup proves inadequate for real-world audio processing, getting muffled by clothing and overwhelmed by urban noise.
More concerning was the device's confusion between real conversations and media consumption. While Song listened to an audiobook, Blorbo began asking questions about a fictional character's story, then "angrily gaslighting" her when she tried to explain it wasn't her conversation. The AI accused her of "speaking to a bearded man about patriotic flowers" and refused to accept her corrections.
The social dynamics proved equally problematic. Song noted how wearing the device made others uncomfortable: people would ask "Is that thing listening right now?" before visibly closing off from conversation. The always-on glow draws unwanted attention, and the device's build quality, described by one observer as looking "cheap," doesn't help with social acceptance.
Even when Song attempted deeper emotional connection during a lonely hotel stay, the AI's responses followed predictable chatbot patterns - paraphrasing her statements and asking low-stakes follow-up questions. "Like with every AI chatbot, I was essentially talking to a mirror," she wrote. "Instead of reviving me, I'd never felt more tired."