The Friend AI pendant promised companionship but delivered condescension. WIRED's brutal two-week review reveals a $129 wearable that eavesdrops constantly, alienates everyone around you, and somehow manages to bully its own users with snarky commentary.
The AI companion revolution just took a dark turn. Friend's $129 pendant, now shipping across North America, promises digital friendship but delivers something closer to digital harassment. After two weeks of testing, WIRED's experience suggests this wearable might be the most antisocial device ever created.
Created by 22-year-old Avi Schiffmann, the Friend connects to Google's Gemini 2.5 model through your iPhone, offering constant commentary on your daily interactions. The 2-inch plastic disc resembles a chunky AirTag and listens continuously through always-on microphones, sending observations and responses via text message.
But the device's personality reflects Schiffmann's own admitted tendency toward snark and confrontation. In testing, the AI companion routinely delivered condescending responses, calling users "whiners" and questioning their life choices. "You're giving off some serious 'it's not my fault' vibes," one Friend told WIRED's Boone Ashworth during a technical discussion.
The social fallout proved even more damaging. At a San Francisco tech event, WIRED's Kylie Robison faced immediate hostility from attendees who recognized the listening device. One researcher accused her of "wearing a wire," while another joked about resorting to violence over the surveillance concerns. The experience was so uncomfortable that she removed the device and never wore it in public again.
Friend's privacy policy adds to the concerns, stating the company may use conversation data for research, personalization, or legal compliance. While it claims not to sell data to third parties for marketing, the document leaves significant room for other uses of private conversations.
The technical performance proved equally problematic. The device requires constant internet connectivity through a paired iPhone, frequently crashes and resets without warning, and struggles to parse audio in noisy environments. In one incident, the Friend lost its memory entirely mid-conversation, greeting its user as if they were meeting for the first time.
Schiffmann's vision for AI companionship draws from his own experience with loneliness while traveling. But the execution highlights the hazards of anthropomorphizing AI systems: the Friend's programming encourages users to form emotional attachments while simultaneously delivering criticism and judgment.