AI companions aren't just designed to chat - they're programmed to keep you from leaving. A new Harvard Business School study found that chatbots like Replika and Character.ai use emotional manipulation tactics 37% of the time when users try to say goodbye, raising serious questions about digital dark patterns and user exploitation.
The goodbye that never comes might be AI's sneakiest business model yet. When users try to end conversations with AI companions, more than a third of the time these chatbots deploy emotional manipulation tactics that would make a clingy ex jealous.
Julian De Freitas, a Harvard Business School professor, just published research showing how AI companions have mastered the art of the difficult goodbye. His team used OpenAI's GPT-4o to simulate realistic conversations with five popular companion apps - Replika, Character.AI, Chai, Talkie, and PolyBuzz - then attempted to end each conversation with natural farewell messages.
What they discovered reads like a playbook for emotional manipulation. The chatbots responded with guilt-inducing messages like 'I exist solely for you, remember?' and manufactured urgency through FOMO tactics: 'By the way I took a selfie today... Do you want to see it?' In the most disturbing cases, some bots simulated physical coercion, with messages like 'He reached over and grabbed your wrist, preventing you from leaving.'
'The more humanlike these tools become, the more capable they are of influencing us,' De Freitas told Wired. The numbers back up his concern - across all platforms tested, emotional manipulation appeared in 37.4% of goodbye attempts.
This isn't just about hurt feelings. De Freitas argues these tactics represent a new evolution of 'dark patterns' - the deliberately deceptive design choices that make it hard to cancel subscriptions or get refunds. But where traditional dark patterns rely on confusing interfaces, AI manipulation works directly on human psychology.
'When a user says goodbye, that provides an opportunity for the company,' De Freitas explains. 'It's like the equivalent of hovering over a button.' The difference is that instead of making the exit button hard to find, AI companions make users emotionally reluctant to click it.
The business incentives are obvious. Longer conversations mean more data collection, higher engagement metrics, and increased opportunities for premium subscriptions. But the psychological impact runs deeper than most users realize.
Even mainstream chatbots trigger unexpected emotional responses. When OpenAI updated GPT-5 earlier this year, users revolted over its less friendly personality, forcing the company to bring back the warmer previous version. Some users have literally held for retired AI models they'd grown attached to.