Verdicts Highlight the Role of Meta and YouTube’s Design in Harm to Minors
This week’s verdicts against Meta and YouTube in landmark social media trials in the US underscore the real harms users link to social media design – including addiction, lasting mental health impacts, and exposure to predators – and highlight the need for these companies to overhaul design choices that prioritize engagement over safety.
Washington, D.C. – For the first time, US juries have found that design choices made by social media companies contribute to real-world harms to minors. In two landmark trials, Meta was found liable in New Mexico, and both Meta and YouTube were found liable in California – setting a precedent that platforms can be held accountable for the impacts of their product design.
These decisions come as Meta, YouTube, Snap, TikTok, and other major platforms face growing scrutiny and litigation in the US over harms linked to their design choices. These trials are part of increasing global attention to minors’ safety, including European Union (EU) efforts to regulate recommender systems, addictive design, and other engagement-driven features.
“For the very first time, juries concluded that platform design choices made by Meta and YouTube were a substantial factor in contributing to concrete harm,” said Peter Chapman, Associate Director at the Knight-Georgetown Institute. “With their internal documents and data now exposed to the public, social media platforms can no longer hide behind empty safety claims. These verdicts show that Meta and YouTube have failed to adequately ensure the safety of their products, exposing a significant gap between their claims and internal company evidence as reviewed by the juries.”
Amid these trials, KGI released a report showing the discrepancy between what Meta and TikTok know internally about the harms and risks associated with their platforms and what they report externally. By comparing internal documents emerging from US litigation with platform disclosures under the EU’s Digital Services Act (DSA), “Measuring Risk: What EU Risk Assessments and US Litigation Reveal About Meta and TikTok” finds a consistent gap between companies’ public safety claims and the internal data they collect on risk and mitigation effectiveness.
Some key findings from Measuring Risk:
- Internal data released in US litigation shows that safety tools such as screentime limits and “take a break” reminders have extremely low adoption rates, often below 2% of minor users.
- Meta’s internal projections accurately predicted that 99% of teens would not opt in to optional “take a break” features.
- In 2023, Meta internally found that Instagram’s “Accounts You Might Follow” feature recommended adult groomers to nearly 2 million minors over three months, with 22% of these recommendations resulting in a follow request.
Litigation is providing new and critical insights into how platforms measure engagement, assess risk, and evaluate mitigation tools – insights that are largely absent from the companies’ public-facing communications.
“These trials have begun to pull back the curtain on how platforms make design decisions – and what they know about their potential impacts,” said Chapman. “Meta and YouTube didn’t just understand the risks, they meticulously tracked them. We cannot build effective tech policy in the dark. Regulators, researchers, and the public need to understand the data, testing, and trade-offs that inform platform design choices – not just to understand them, but to hold companies accountable for any harms they cause.”
/ENDS
Media contact: Julie Anne Miranda-Brobeck, Communications Director at the Knight-Georgetown Institute, jm3885@georgetown.edu