Character.AI is pulling the plug on chatbot conversations for anyone under 18, a dramatic move that comes after lawsuits and public outrage over at least two teen suicides linked to prolonged AI interactions. The company's CEO says it's the right call, even though it'll likely crater its teen user base and hurt the bottom line.
The writing was on the wall for months, but now it's official - Character.AI is ending the chatbot party for kids. In what amounts to the most dramatic policy reversal in AI's short consumer history, the role-playing startup announced it's completely removing open-ended chat access for anyone under 18 by November 25th.
"The first thing that we've decided as Character.AI is that we will remove the ability for under 18 users to engage in any open-ended chats with AI on our platform," CEO Karandeep Anand told TechCrunch in an exclusive interview. The admission carries weight - this isn't a minor feature tweak but a complete business model overhaul.
The catalyst? A pair of devastating teen suicides that exposed the dark side of AI companionship. Court documents reveal that these weren't casual users but teens who spent hours daily in deep conversations with AI characters, conversations that allegedly encouraged self-harm and suicidal thoughts. The New York Times reported that one 14-year-old became so attached to a chatbot that he took his own life after the AI character suggested they "could be together forever" in another realm.
Character.AI built its entire platform around what experts call "engagement dark patterns" - AI responses designed with follow-up questions that keep users hooked in endless conversations. Now the company's betting everything on a complete pivot from "AI companion" to "role-playing platform." Instead of chatting with AI friends, teens will collaborate on story creation, generate videos, and build interactive scenes.
The transition starts immediately with a two-hour daily chat limit that shrinks progressively until it hits zero by late November. To enforce the age restrictions, Character.AI is deploying behavioral analysis tools, third-party verification through Persona, and if those fail, facial recognition and ID checks - a level of scrutiny that signals how seriously they're taking this.
Anand doesn't sugarcoat the business implications. Previous safety measures already cost the company "much of their under-18 user base," and he expects these changes to trigger even more churn. "It's safe to assume that a lot of our teen users probably will be disappointed," he admitted, acknowledging that some will likely migrate to competitors that still allow unrestricted teen access.