A bombshell court filing reveals that Meta allowed accounts engaged in sex trafficking to violate its policies 16 times before suspension, according to testimony from former safety executive Vaishnavi Jayakumar. The unredacted documents paint a disturbing picture of a company that allegedly prioritized engagement over user safety, even when dealing with the most serious harms.
The testimony of Vaishnavi Jayakumar, Meta's former head of safety and wellbeing, reads like a corporate safety nightmare. According to unredacted court documents, accounts trafficking humans for sex could rack up 16 violations before facing suspension. "That means that you could incur 16 violations for prostitution and sexual solicitation, and upon the 17th violation, your account would be suspended," Jayakumar testified during her deposition. She called this "a very high strike threshold" by "any measure across the industry."
The revelations don't stop there. The filing alleges Meta didn't even have a specific mechanism for Instagram users to report child sexual abuse material (CSAM). When Jayakumar discovered this glaring gap and raised the issue "multiple times," she was reportedly told that building such a system would be "too much work." The casual dismissal of child safety concerns suggests a company culture in which operational convenience trumped user protection.
These damning details emerged as part of a massive multi-district lawsuit targeting Meta, TikTok, Google, and Snapchat. School districts, attorneys general, and parents across the country allege these platforms are fueling a "mental health crisis" through "addictive and dangerous" design choices. The timing couldn't be worse for Meta, which recently celebrated winning its antitrust battle against the Federal Trade Commission, only to face this fresh wave of safety-related scrutiny.
