Big Tech Loses in Court: What the Social Media Addiction Verdicts Mean

Two juries delivered consecutive verdicts last week that could permanently reshape how social media platforms operate in the United States. On Tuesday, March 24, a New Mexico jury ordered Meta to pay $375 million for failing to protect children from predators on Instagram and Facebook. On Wednesday, March 25, a Los Angeles jury found both Meta and Google’s YouTube liable for designing their platforms to addict young users, awarding an additional $6 million in damages. Neither verdict is the final word, as both companies are appealing, but together they signal that the legal dam protecting Big Tech may finally be breaking.

What the Cases Were About

The Los Angeles case centered on a now-20-year-old California woman identified in court documents as K.G.M. and referred to at trial as Kaley. She said that Instagram and YouTube encouraged addictive use when she was a minor, contributing to depression, anxiety, body dysmorphia and suicidal thoughts. She started watching YouTube at age six and joined Instagram at nine, years before either platform’s minimum age of 13. By the time she finished elementary school, she had posted 284 videos on YouTube and described spending up to 16 hours in a single day on Instagram.

After a seven-week trial and more than 43 hours of deliberation, the jury found Meta 70% responsible for her harm and YouTube 30% responsible. The jury also found that both companies acted with malice, oppression or fraud, triggering punitive damages on top of the $3 million compensatory award. Meta was ordered to pay an additional $2.1 million in punitive damages and YouTube an additional $900,000, bringing the total to $6 million.

The New Mexico case proceeded on a separate but related theory: that Meta violated state consumer protection laws by misleading users about the safety of its platforms while knowingly enabling child sexual exploitation through Instagram and Facebook. That trial enters a second phase in May, in which a judge will decide whether Meta created a public nuisance and whether it must pay additional penalties and redesign its apps.

The Legal Strategy That Changed Everything

For years, social media companies defeated virtually every lawsuit by invoking Section 230 of the Communications Decency Act, the federal provision that shields internet platforms from liability for content posted by third parties. The argument was straightforward: if a user posted something harmful, the platform was not responsible for it.

Plaintiffs’ attorneys found a way around it. Instead of targeting harmful content, they targeted harmful design. The Los Angeles trial focused on specific software features, including infinite scroll, autoplay video, algorithmic notification systems and engagement-maximizing beauty filters, arguing that these constitute defective product design regardless of what content they deliver. That theory does not implicate Section 230 at all, and the jury accepted it.

The internal documents introduced at trial gave the theory its teeth. One Meta memo read: “If we wanna win big with teens, we must bring them in as tweens.” Another showed that 11-year-olds were four times as likely to return to Instagram as to competing apps, despite the platform’s stated minimum age of 13. A third document, written by a Meta employee, stated: “Oh my gosh yall IG is a drug… Lol, I mean, all social media. We’re basically pushers.” Under questioning, CEO Mark Zuckerberg told the jury that keeping young users safe has always been a company priority. The jury disagreed.

What It Means for Ordinary People and Parents

The $6 million verdict is a rounding error against Meta’s annual profits, and the market treated it that way: Meta’s stock rose on the day of the verdict, in part because the announcement coincided with Zuckerberg’s appointment to a new White House advisory council. But dollar amounts are not the point.

The Los Angeles case was a bellwether, a test case specifically selected to establish legal precedent and guide the resolution of thousands of related lawsuits already pending in courts across the country. As a Cornell University tech policy professor put it this week, “as this case goes, so might these others.”

For parents, the documents now in the public record are damning independent of any legal outcome. It is no longer a matter of suspicion or parental intuition that these platforms were engineered to hook children. A jury of twelve people heard the evidence and concluded that executives knew the harm they were causing and proceeded anyway.

The practical implication is that settlements across thousands of pending cases become far more likely. Once a defendant faces consistent jury verdicts, litigation risk calculations change. Legal experts are already drawing comparisons to the 1990s tobacco litigation, which ultimately forced industry-wide behavioral changes that no single case or regulator had been able to impose. Those comparisons now extend beyond the United States, with some suggesting the verdict could serve as a template for class actions and individual claims worldwide.

Meta has acknowledged the stakes in its own public filings. In its 2026 annual report filed with the SEC, the company warned investors that youth addiction lawsuits and mass arbitration demands could significantly impact its financial results.

Who Else Can Sue and What Is Already Filed

The litigation landscape is far broader than the two verdicts this week suggest. Here is who is already in the pipeline and who else may have claims.

Individual minors and young adults. Any person who used Instagram, YouTube, TikTok or Snapchat as a minor and suffered documented mental health harm, including depression, eating disorders, self-harm or anxiety, may have a viable claim under the design-defect theory. Each plaintiff must still prove that their specific harm was causally linked to a specific platform, which requires medical documentation and expert testimony. The verdict does not create an automatic win, but it validates the legal theory and dramatically improves the odds for similarly situated plaintiffs.

State attorneys general. New Mexico’s verdict was brought under state consumer protection law, a separate and potentially more powerful track. California’s attorney general has his own trial scheduled for August. Other state AGs are watching. The consumer protection theory does not require proving addiction to any individual child; it requires showing that the company deceived consumers about safety. That is a lower bar with potentially broader reach.

School districts and federal plaintiffs. A federal trial is now expected to begin in June in Oakland, in the Northern District of California, involving consolidated claims by school districts and parents nationwide against Meta, YouTube, TikTok and Snap, with 235 plaintiffs already named. School districts are suing as institutional plaintiffs, claiming they have had to bear the cost of counselors, mental health interventions and disciplinary incidents caused by platform-driven harm to their students.

The pending cases. The Los Angeles bellwether was tied to hundreds of similar cases brought by more than 1,600 plaintiffs ranging from California school districts to families who say the platforms harmed their children. Those plaintiffs now have a favorable jury verdict as a reference point, which materially strengthens their settlement position even if they never reach trial.

TikTok and Snap. Both settled with Kaley just before the trial concluded, for undisclosed amounts. Neither settlement resolves their exposure in other ongoing proceedings. Both companies still face hundreds of individual and consolidated claims in the federal multidistrict litigation and in state courts across the country.

What It Means for the Tech Industry

The design-defect theory, if it survives appeal, fundamentally reframes the liability exposure of every platform that uses engagement-maximizing algorithms. Autoplay, infinite scroll, notification engineering and algorithmic content ranking are not incidental features; they are the core of the business model. If those features constitute actionable product defects when applied to minors, the financial exposure across thousands of cases runs into the billions.

Some legal scholars have raised a counter-argument worth watching: in practice, there is no clean line between platform design and content. Autoplay only hooks users because of what it plays. Infinite scroll only retains users because of what it surfaces. If appellate courts eventually agree that the design-defect theory inevitably implicates content decisions, Section 230 could reassert itself and reverse the liability landscape.

At the federal legislative level, the Kids Online Safety Act has stalled, failing to advance through Congress in either 2024 or 2025. That gridlock has shifted momentum to the courts and to state attorneys general, who have proven more willing to act aggressively. Critics of KOSA continue to warn that broadly defining harmful content could sweep in LGBTQ+ and reproductive health information, a concern that has slowed progress on Capitol Hill.

The Bottom Line

Two consecutive jury verdicts in one week, totaling over $380 million and covering two independent legal theories, represent a meaningful inflection point. The companies are appealing, and appellate courts may yet reverse course. But the legal playbook that protected social media platforms for a decade has been successfully challenged, the internal documents are now public, and a wave of cases, in state courts, in federal court and potentially in courts around the world, is waiting in line.

If you are a parent, this week’s verdicts confirm what many families have suspected: these platforms were not passively indifferent to the harm they caused children. They engineered it. If you are a business that relies on social media platforms for marketing, distribution or operations, the potential for industry-wide design changes, whether driven by litigation, settlement or eventual legislation, is a business continuity question worth thinking through now.

