The House Energy and Commerce Committee just rewrote the playbook on kids' online safety, and tech platforms are breathing a collective sigh of relief. The committee's new 19-bill package strips away the most contentious piece of the Kids Online Safety Act - the duty of care provision that had Meta, Google, and other platforms scrambling to figure out how they'd avoid lawsuits over teen depression and eating disorders. What emerged instead is something that might actually have a shot at becoming law.
The original KOSA, which sailed through the Senate 91-3 last year, would have made platforms legally responsible for mitigating a broad range of mental health harms tied to their services. Critics across the political spectrum warned this could create a censorship nightmare, potentially blocking resources that help teens dealing with the very issues the bill aimed to address. Now, the House version takes a scalpel to that concern, replacing the sweeping duty of care with specific requirements around four narrow categories of harm.
Under the House version, platforms would need "reasonable policies, practices, and procedures" to address threats of physical violence, sexual exploitation, drug sales, and financial fraud. The requirements scale with platform size and technical capability, a clear nod to smaller social media companies that argued they couldn't match Meta's content moderation armies. The bill also brings nonprofit platforms under its umbrella, a significant expansion that could sweep in services run by nonprofits, such as Wikipedia.
But KOSA isn't operating alone anymore. The package includes the App Store Accountability Act, which would force Apple and Google to verify users' ages before allowing app downloads and pass that information along to developers. It's the federal counterpart to legislation already passed in multiple states, laws that have created the patchwork compliance headache platforms have been dreading. The approach shifts the age verification burden onto app stores rather than individual platforms, a strategic change that could fundamentally reshape how digital identity works online.