Recent jury verdicts against Meta Platforms Inc. and Alphabet Inc.’s Google are poised to redefine the legal landscape for major technology companies, signaling a potential new era of accountability for product design and user harm. Last week, two separate trials in Los Angeles and New Mexico delivered significant blows, challenging the long-held legal shields that have protected Silicon Valley giants.
In Los Angeles, a jury found Meta, owner of Instagram, and Google, owner of YouTube, liable for deliberately designing their applications to be addictive. This design, the jury determined, contributed to the mental health struggles of a young woman who began using the platforms as a child, resulting in a $6 million damages award. Concurrently, a New Mexico jury ordered Meta to pay the state $375 million for failing to adequately protect young users from child predators. New Mexico Attorney General Raúl Torrez has indicated plans to seek further penalties in a second phase of the trial, set for May, focusing on whether Meta created a public nuisance, and will also push for court-mandated changes to Meta’s app designs to enhance safety.
Cracks in the Legal Shield
For decades, Section 230 of the Communications Decency Act of 1996 has served as a formidable legal bulwark, largely shielding online platforms from liability for content posted by users. This protection often halted lawsuits against tech companies in their tracks. Carrie Goldberg, a lawyer specializing in tech accountability, experienced this firsthand in 2017 when her client, Matthew Herrick, sued the dating app Grindr. Herrick alleged his ex-boyfriend used fake profiles to harass him, but the case was dismissed under Section 230, despite Goldberg’s argument that Grindr offered a defective product by claiming inability to stop the harassment. Goldberg recounted, "We appealed and appealed and lost every appeal. And then the case was ultimately dismissed."
However, in the nine years since Herrick’s case, courts have shown increasing willingness to consider arguments that tech companies can be held accountable for their product design and monetization strategies. Goldberg herself successfully sued Omegle, a video chat site accused of enabling child sexual exploitation, in 2021, leading to a settlement and the site’s shutdown. The same year, an appeals court allowed a lawsuit against Snapchat to proceed over a speed filter implicated in deadly car crashes, rejecting Section 230 defenses. Snapchat later settled that case in 2023.
This evolving legal approach, often referred to as a product liability argument, draws parallels to the legal campaign waged against the Big Tobacco industry in the 1990s. Advocates for tech accountability have explicitly embraced this playbook. Goldberg asserts that decisions regarding how apps function and generate revenue "are things that, in my mind, the platform should be liable for if they get it wrong and injure somebody." She declared, "This is the dawn of a new era, with people finally getting to hold tech platforms responsible for the harms they cause."
Industry Response and Expanding Litigation
Both Meta and Google have announced intentions to appeal the recent verdicts. Meta contends that teen mental health issues cannot be attributed to a single application, while Google maintains that YouTube is not primarily a social media platform. Legal experts widely anticipate that this novel legal theory of liability will ultimately be reviewed by the Supreme Court.
Despite the impending appeals, the recent verdicts have already galvanized further legal action. Thousands of related cases against social media platforms are currently progressing through state and federal courts. The scope of this litigation is also broadening beyond traditional social media. Credit rating agency Moody’s reports more than 4,000 pending cases targeting 166 companies, all alleging addictive software design.
New lawsuits are emerging against makers of video games, online gambling applications, and artificial intelligence chatbots. A day after the Meta and YouTube verdict, a lawsuit was filed in Massachusetts state court accusing sports betting sites DraftKings and FanDuel of fostering gambling addiction through designs that encourage compulsive use. Jennifer Hoekstra, a partner at Aylstock, Witkin, Kreis & Overholtz, representing the plaintiff, explained the personalized nature of these apps: "It’s personalized itself to you. If you don’t log in for 72 hours, it starts telling you, ‘Hey, if you made a bet on this match, you could have made this much money.’" DraftKings has stated it will "vigorously defend against these lawsuits."
Furthermore, Matthew Bergman of the Social Media Victims Law Center, which represented the plaintiff in the LA trial, has also filed lawsuits against OpenAI and other AI chatbot developers, alleging their products contribute to mental health crises and suicides. OpenAI has acknowledged the "incredibly heartbreaking situation" and stated it is collaborating with mental health experts to enhance its chatbot’s responses to signs of distress.
Beyond the Courtroom: Driving Systemic Change
Advocates are hopeful that these initial legal successes will generate momentum beyond the courtrooms, pushing for the passage of long-stalled tech regulation and fostering systemic change within Silicon Valley. Sarah Gardner, who leads the Heat Initiative, an advocacy group focused on online child safety, observed that the verdicts have "just created a different playing field than we had even a few months ago." She draws parallels to the tobacco industry’s transformation, noting, "If you go and look at what really changed the tobacco industry, it wasn’t one thing, it was everything together." Gardner emphasizes the need to "create enough pressure that it will change the business incentives."
Matthew Bergman reinforces this sentiment, stating, "The only way they’re going to change their behavior is if you internalize the cost of safety." While the financial damages levied against Meta and Google thus far are relatively small compared to their multitrillion-dollar valuations, Bergman believes these initial verdicts send an unequivocal message to the tech industry: "If you grab them by the pocketbook, their hearts and minds will follow." The growing legal pressure, coupled with increasing public scrutiny and regulatory calls, suggests that tech companies may face an escalating imperative to prioritize user safety and well-being in their product development and business models, fundamentally altering their operational calculus.