Meta just expanded its Teen Accounts feature globally across Facebook and Messenger, bringing enhanced safety controls to millions of young users worldwide. The move comes as fresh research suggests teens still face risks on Instagram despite existing protections, intensifying scrutiny over social media's impact on youth mental health.
Meta is doubling down on teen safety across its platform empire. The company announced Thursday that Teen Accounts - the restrictive experience initially limited to Instagram - now covers Facebook and Messenger users globally, marking the biggest expansion of youth protections since the program launched last fall.
The rollout is no accident. Meta and other social media giants were grilled by U.S. lawmakers earlier this year for failing to adequately protect teens from online harm. The congressional hearings exposed how platforms have struggled to balance user engagement with youth safety, leading to intense regulatory pressure.
Under the expanded Teen Accounts system, Facebook and Messenger automatically place users under 18 into a locked-down experience designed to limit inappropriate content and unwanted contact. Teens under 16 need parental permission to modify any settings - a significant departure from Meta's traditionally permissive approach.
The restrictions run deep. Teens can only receive messages from people they follow or have previously contacted. Story interactions are limited to friends only, while tags, mentions, and comments face similar restrictions. The platform also enforces daily usage limits with hourly reminders and automatically enrolls teens in "Quiet mode" during overnight hours.
But the timing reveals Meta's defensive positioning. Research led by a Meta whistleblower recently found that children and teens remain at risk from online harm on Instagram, even with Teen Accounts active. The study documented teens encountering suicide and self-harm content alongside posts describing demeaning sexual acts - exactly the type of content the protections were designed to block.
Meta has disputed the claims, arguing that its safety measures have successfully reduced teen users' exposure to harmful content. The company insists the protections are working as intended, though the whistleblower-led research suggests implementation gaps remain.
Alongside the global Teen Accounts expansion, Meta is launching its School Partnership Program across all U.S. middle and high schools. The initiative allows educators to report safety concerns like bullying directly to Instagram for expedited review and removal. Schools participating in the program receive priority reporting channels and educational resources, plus a banner identifying them as official Instagram partners.