According to The Verge, former Meta head of safety Vaishnavi Jayakumar testified that accounts engaged in human sex trafficking could rack up 16 violations before being suspended on the 17th. This 17-strike policy was confirmed in internal documentation and described as “very high” by industry standards. The unredacted court filing also reveals that Meta reportedly lacked a specific reporting tool for child sexual abuse material on Instagram, with Jayakumar claiming she was told building one would be “too much work.” The filing further shows that Meta rejected making teen accounts private by default in 2019 after growth teams found the change would “likely smash engagement,” though the company eventually implemented it last year. The documents are part of a massive lawsuit filed by school districts against Meta, TikTok, Google, and Snapchat, alleging the companies contribute to a “mental health crisis” through “addictive and dangerous” platforms.
The engagement-first culture
Here’s the thing that really stands out in these allegations: it’s not just one bad policy. It’s a pattern where safety consistently took a backseat to engagement metrics. When researchers found that hiding like counts made users feel “significantly less likely to feel worse about themselves,” Meta walked the change back because it was “pretty negative to FB metrics.” When the company considered making teen accounts private by default, the growth team killed the idea because it would hurt engagement. Even beauty filters that were “actively encouraging young girls into body dysmorphia” got reinstated because removing them might have a “negative growth impact.”
Mounting legal troubles
Meta may have won its antitrust battle with the FTC, but this child safety lawsuit is part of a much bigger wave of regulatory pressure. Dozens of school districts, attorneys general, and parents are now involved, and these unredacted filings are pretty damning. The company’s position becomes increasingly difficult to defend when internal documents show such clear trade-offs between user wellbeing and platform metrics. And let’s be honest – a 17-strike policy for sex trafficking? That’s hard to justify under any circumstances.
Meta’s defense
Meta spokesperson Andy Stone pushed back hard, calling the allegations “cherry-picked quotes and misinformed opinions” that present a “deliberately misleading picture.” He pointed to Teen Accounts with built-in protections and parental controls as evidence of the company’s commitment to safety. But here’s the question: if these allegations are truly misleading, why did it take until last year to make teen accounts private by default? And why did internal researchers have to fight for basic safety features that were repeatedly rejected for engagement reasons?
What this means for social media
This case isn’t just about Meta – it’s about the entire social media industry’s business model. When engagement metrics drive everything, safety becomes negotiable. The lawsuit against TikTok, Google, and Snapchat shows this is an industry-wide problem. Basically, we’re seeing the consequences of platforms built around keeping users scrolling at all costs. And as these court documents show, the costs can be devastatingly human.
