Meta allegedly buried internal research showing Facebook users became less depressed and anxious after quitting the platform, according to newly unredacted court documents released Friday. The bombshell revelation comes from Project Mercury, a 2019 study the company quietly shelved when initial results suggested its apps harm mental health.
The social media giant finds itself in the crosshairs of a potentially devastating legal battle after court documents filed in the Northern District of California exposed what plaintiffs describe as a deliberate cover-up of harmful research findings.
Project Mercury launched in late 2019 with Meta's stated goal to "explore the impact that our apps have on polarization, news consumption, well-being, and daily social interactions." But when the study's initial results showed people who stopped using Facebook "for a week reported lower feelings of depression, anxiety, loneliness, and social comparison," the company allegedly pulled the plug rather than publish the findings.
The timing couldn't be worse for Meta. The company is already facing a sprawling multidistrict lawsuit from school districts, parents, and state attorneys general targeting not just Meta but also Google's YouTube, Snap, and TikTok. These plaintiffs claim social media companies knowingly caused mental health harm to children and young adults while misleading educators and authorities about the risks.
"The company never publicly disclosed the results of its deactivation study," the lawsuit states bluntly. "Instead, Meta lied to Congress about what it knew." The accusation carries particular weight given Mark Zuckerberg's repeated congressional testimony about the company's commitment to user safety and well-being.
Perhaps most damning is an unnamed Meta employee's internal comment captured in the filing: "If the results are bad and we don't publish and they leak, is it going to look like tobacco companies doing research and knowing cigs were bad and then keeping that info to themselves?" The comparison to Big Tobacco's decades-long suppression of cancer research isn't subtle.
Meta spokesperson Andy Stone pushed back against the allegations in a series of social media posts, calling the lawsuit's characterization "deliberately misleading." Stone argued the 2019 study was fundamentally flawed, claiming it only showed that "people who believed using Facebook was bad for them felt better when they stopped using it."
"We strongly disagree with these allegations, which rely on cherry-picked quotes and misinformed opinions," Stone said in an official statement. The company points to recent initiatives like Teen Accounts with built-in protections as evidence of its commitment to user safety.
But the legal pressure is mounting from multiple directions. The multidistrict litigation represents one of the most comprehensive challenges to social media's business model to date, with plaintiffs arguing these platforms deliberately designed addictive features while hiding evidence of psychological harm.
The Project Mercury revelations add fuel to broader regulatory fires already burning around Meta. The company faces intensifying scrutiny from lawmakers who've grown increasingly skeptical of its internal research practices, especially after Facebook whistleblower Frances Haugen's 2021 testimony exposed similar instances of suppressed safety research.
Meta's stock has weathered previous controversies, but this latest filing strikes at the heart of the company's credibility with regulators and users alike. The tobacco industry comparison isn't just rhetorical - it suggests a pattern of corporate behavior that courts and lawmakers have historically punished severely.
As the legal battle unfolds, Meta faces a critical test of whether its public commitments to user safety match its internal research priorities. With more unredacted documents likely coming and congressional oversight intensifying, the Project Mercury controversy may be just the beginning of a much larger reckoning for social media's impact on mental health.
The Project Mercury allegations raise fundamental questions about corporate responsibility in the digital age. If the claims hold up in court, Meta could face not only substantial financial penalties but new obligations governing how social media companies conduct and disclose safety research, with implications extending across the industry. As more documents emerge from the litigation, the tech industry's own tobacco moment may finally be arriving.