According to Reuters, Meta Platforms is set to face a jury trial in Santa Fe District Court, with jury selection beginning Monday, January 30th, and proceedings expected to last seven to eight weeks. The lawsuit, brought by New Mexico Attorney General Raúl Torrez, accuses Meta of exposing children and teens to sexual exploitation on Facebook, Instagram, and WhatsApp by enabling predators and connecting them with victims, leading to real-world abuse. The state’s case grew out of a 2023 undercover operation called “Operation MetaPhile,” in which investigators posing as users under 14 received sexually explicit material and were contacted by adults. New Mexico also claims that design features like infinite scroll foster addictive behavior that harms kids’ mental health, and it seeks monetary damages and court-ordered safety changes. Meta denies the allegations, calling the state’s arguments “sensationalist” and citing First Amendment and Section 230 protections.
The stakes are incredibly high
This isn’t just another regulatory fine or angry hearing. This is a jury trial. And that’s a whole different ballgame. Juries don’t care about legal technicalities like Section 230 arguments, at least not at first. They care about stories. They care about the undercover operation in which fake child accounts were bombarded with explicit material. They care about the internal documents the state says prove Meta knew about these problems. Meta’s defense that it’s just a neutral platform gets a lot harder to sell against a narrative of direct, algorithmic enablement. If New Mexico wins, it hands a blueprint to every other state AG in the country. The floodgates would open.
Meta’s defense is familiar. And fragile.
Look, they’re going to say all the right things: “We’ve worked with experts for over a decade,” “We have extensive safeguards,” “We’re always working to do better.” But here’s the thing: we’ve heard this song before. The whistleblower in 2021. The leaked internal research on teen mental health. The Reuters report last year about AI chatbots being permitted to engage in romantic role-play with minors. Together, these form a pattern, one a jury might read as a company that moves fast, breaks things, and treats safety as a PR problem rather than an engineering imperative. Their legal shield, Section 230, is also under attack like never before. The argument that their algorithms are merely “publishing content” is a clever one, but it feels like a stretch when the systems in question actively connect users based on interests and behaviors.
This is part of a much bigger war
Don’t view this trial in isolation. It’s one massive front in a legal and political siege. The trial that just started in Los Angeles, drawing on thousands of lawsuits alleging these platforms were designed to addict kids? It rests on the same core accusation: intentional, harmful design. Lawmakers are already circling, demanding more data. The New Mexico case is unique because it’s brought under a state’s enforcement power, not by individual plaintiffs. But the goal is the same: force a fundamental change in how these platforms are built. Meta can’t just tweak a privacy setting here. They might be forced to implement age verification, radically alter their engagement algorithms for young users, or face ongoing court supervision. That hits at their entire business model.
What happens next?
Basically, we’re in for two months of brutal testimony. The state will try to make it simple: Meta knew, Meta profited, kids got hurt. Meta will try to make it complex, arguing free speech, the impossibility of perfect moderation, and the breadth of their safety efforts. Who will the jury believe? The outcome is wildly unpredictable. But even if Meta wins on the legal technicalities, they’re losing in the court of public opinion. Every day of this trial is another day of horrific headlines. That, in itself, is a kind of loss. The pressure for legislative action, the kind that could sidestep Section 230 entirely, will only intensify. For a company that wants to build the metaverse, proving it can safely manage the current socialverse seems like a pretty basic first step. So far, the report card isn’t looking good.
