According to The Wall Street Journal, a landmark bellwether trial has begun in Los Angeles County Superior Court, brought by a 19-year-old plaintiff known as KGM. She alleges that addictive use of platforms like Instagram, Facebook, YouTube, and TikTok caused her depression, anxiety, and body dysmorphia; her lawsuit states that her mental health worsened the more she used the apps. Her case is just one of thousands of similar personal injury lawsuits filed against social media companies. Meanwhile, Australia's new law banning social media for users under 16 has already led to the deactivation of nearly five million accounts. The trial's outcome is seen as critical for setting a precedent, even though Snap and TikTok have already settled with this particular plaintiff.
The Causation Problem
Here’s the thing: proving this in court is a nightmare. The WSJ podcast hosts nail the core issue. Sure, there’s plenty of observational data linking the rise of social media to a decline in teen mental health, especially for girls. Jonathan Haidt’s book The Anxious Generation lays that out. But a courtroom isn’t about population-level correlations. It’s about proving, for one specific person, that social media caused their illness—and not the other way around, or that it wasn’t a dozen other life factors.
And in KGM’s case, it gets messy fast. She experienced domestic abuse as a toddler and was in therapy for it. That’s a massive, known risk factor for later mental health struggles. So how do you disentangle that trauma from the impact of scrolling through Instagram? You’d need a randomized controlled trial you can’t ethically run. So the plaintiffs are left trying to convince a jury with shaky observational studies and internal company emails. It’s a brutally high bar to clear.
The Section 230 Hurdle
But even if they clear the science hurdle, there's a legal mountain called Section 230. This provision of the 1996 Communications Decency Act broadly immunizes internet platforms from liability for content posted by their users. The companies argue it shields them here: if a user posts harmful dieting content, the platform isn't liable for that third-party speech.
So the lawsuits are trying an end-run. They’re framing it as a product liability case. The argument is that the design—features like infinite scroll, autoplay, and notification algorithms—is negligently and addictively engineered. It’s not about the content; it’s about the slot-machine-like delivery system. But is that distinction clear enough? As the WSJ notes, infinite scroll is everywhere online. If a news site uses it, are they liable if someone reads too many depressing articles? This legal theory is untested, and it feels like the entire appeal is already being written, no matter what this trial jury decides.
The Big Tobacco Comparison That Falls Short
Everyone wants this to be the next Big Tobacco litigation. I get the appeal. You have damning internal emails (like a Meta researcher calling IG “a drug,” even if they claimed sarcasm). You have a product that might be broadly harmful. But the comparison is weak where it matters most: causation. Smoking causes measurable, physical damage in virtually everyone who does it enough. Social media use does not cause measurable psychological harm in every teen. Plenty use it and are fine.
So what are you really suing over? A dangerous product? Or a product that can exacerbate pre-existing vulnerabilities in some users? That’s a much fuzzier, more difficult legal target. It gets into questions of personal responsibility, parenting, and individual psychology that a class-action lawsuit against cigarette makers never had to deal with.
So What Happens Now?
This trial is the test case everyone is watching. If the plaintiff wins, it opens the floodgates for the thousands of other cases, regardless of the inevitable Section 230 appeals. It would be a seismic shift. But I think a loss is more likely, precisely because the causation and legal immunity issues are so thorny.
Look, the real action might already be elsewhere. Australia’s ban deactivating five million accounts shows that legislative action—however blunt—is moving faster than the courts. States in the U.S. are pushing their own age restriction laws. The public and political pressure is building for some kind of change, whether it’s through litigation, legislation, or both. This trial might give us a dramatic story, but the future of social media governance probably won’t be written by a single jury in California. It’ll be a messy fight across every possible front.
