KGM v. Meta & Google/YouTube

By Tom Berg

February 18, 2026

About this collection

The case of KGM — identified in court documents as Kaley G.M., a young woman now approximately 20 years old — sits at the center of one of the most consequential legal battles in the history of social media. KGM alleges that Meta (owner of Instagram and Facebook) and Alphabet/Google (owner of YouTube) deliberately engineered their platforms to be addictive, specifically targeting children and teenagers, and that her compulsive use of these platforms beginning in early childhood caused her to develop depression, anxiety, body dysmorphia, suicidal thoughts, and other serious mental health harms. [1][2]

KGM's case has been selected as a **bellwether trial** — one of three test cases drawn from a massive consolidated group of lawsuits known as **JCCP 5255**, representing thousands of plaintiffs across the United States. The outcome of her trial in Los Angeles Superior Court is expected to set legal and financial precedents that will shape how hundreds of similar cases proceed, potentially forcing tech giants to fundamentally redesign their platforms and exposing them to billions of dollars in liability. [3][2] The trial, which began with jury selection on January 27, 2026, and opening statements on February 9, 2026, is expected to last six to eight weeks. [4][1]

KGM's lawsuit argues that the companies borrowed techniques from slot machines and the tobacco industry — using dopamine-manipulating algorithms, infinite scroll, autoplay, push notifications, and social comparison features like "Likes" — to maximize youth engagement and advertising revenue, while knowingly concealing the mental health risks from the public. [1][3] The case exists within a broader national reckoning: more than 40 state attorneys general have filed related lawsuits against Meta, a parallel trial in New Mexico accuses Meta of enabling child sexual exploitation, and a federal bellwether trial representing school districts is scheduled for June 2026 in Oakland. [5][6]

Curated Sources

KGM v Meta Cast and Timeline

The case of KGM v. Meta & Google/YouTube centers on a young woman (Kaley G.M.), approximately 20 years old at trial, who alleges that Meta (Facebook/Instagram) and Alphabet/Google (YouTube) deliberately engineered their platforms to be addictive, specifically targeting children and teenagers. KGM claims this compulsive use caused severe mental health harms including depression, anxiety, body dysmorphia, and suicidal thoughts. This case is a bellwether trial selected from JCCP 5255, a massive consolidated group of lawsuits representing thousands of plaintiffs across the U.S. The trial in Los Angeles Superior Court began with jury selection on January 27, 2026, and opening statements on February 9, 2026, with an expected duration of six to eight weeks.

KGM's lawsuit argues Meta and Google used techniques borrowed from slot machines and the tobacco industry—such as dopamine-manipulating algorithms, infinite scroll, autoplay, push notifications, and social comparison features like "Likes"—to maximize youth engagement and advertising revenue while concealing mental health risks. The case is part of a broader national reckoning, with over 40 state attorneys general filing related lawsuits against Meta, a parallel trial in New Mexico accusing Meta of enabling child sexual exploitation, and a federal bellwether trial for school districts scheduled for June 2026.

Key figures include KGM (the plaintiff), Mark Lanier (plaintiff's lead attorney, known for his theatrical style), Paul Schmidt (Meta's trial attorney), Luis Li (YouTube/Google's trial attorney), Mark Zuckerberg (Meta CEO, who testified), Adam Mosseri (Head of Instagram), Neal Mohan (YouTube CEO, expected to testify), and Judge Carolyn B. Kuhl, presiding over the trial. Meta and Google deny the allegations, with Meta citing a "longstanding commitment to supporting young people" and Google claiming the allegations are "simply not true."

The trial features internal company documents, health records, and testimony from mental health providers to contest causation. The case raises critical legal questions about whether platform design defects—not just content—can be actionable under product liability theories, with potential implications for Section 230 protections. A plaintiff victory could open the floodgates to more litigation against tech companies for addictive design features targeting minors.

Key Takeaways

  • The KGM trial is a bellwether case from JCCP 5255 that could set legal precedents affecting thousands of lawsuits against social media platforms for designing addictive features targeting children
  • Plaintiff's attorney Mark Lanier frames the case as 'Addicting the Brains of Children' (ABC), using internal Meta documents showing Zuckerberg set goals to increase time spent on platforms by 12% and evidence of 4 million underage Instagram users
  • Defendants argue KGM's mental health issues stemmed from her difficult childhood and pre-existing conditions, not platform design, presenting her health records and testimony from her mental health provider
  • The trial tests whether platform design defects—not just user-generated content—are actionable under product liability theories, with potential implications for Section 230 protections
  • A plaintiff victory could force fundamental redesigns of social media platforms and expose tech giants to billions in liability, while a defense victory would likely shield platforms under current legal frameworks

What Does the First US Social Media Addiction Trial Mean for the Tech Industry? | TechPolicy.Press

This article covers the landmark first US social media addiction trial set to begin in Los Angeles Superior Court, focusing on whether social media companies can be held liable for platform design rather than just user-generated content. The case involves plaintiff K.G.M. and her mother suing Meta, YouTube, Snap, and TikTok, alleging that algorithmic features like "infinite scroll" and "autoplay" caused harm. Snap and TikTok settled, leaving Meta and Google as defendants. The trial is part of a Judicial Council Coordination Proceeding (JCCP) consolidating approximately 1,600 plaintiff cases nationwide. Key legal questions center on whether design features constitute actionable conduct separate from protected content under Section 230 of the Communications Decency Act. Internal Meta communications revealed during discovery show company employees comparing Instagram to "a drug" and referencing "Reward Deficit Disorder" among users. A 2019 Instagram presentation described teens viewing the platform through an "addict's narrative," spending excessive time despite recognizing negative impacts. Legal experts note high stakes: successful plaintiff arguments could fundamentally change how social media platforms serve content to minors by weakening Section 230's protective scope. The trial will test whether courts treat addictive design as distinct from content moderation. Defense arguments may focus on Section 230 immunity, First Amendment concerns, and unsettled science around social media addiction. Plaintiffs will emphasize internal documents showing companies prioritized engagement over safety despite knowing harms. The case gained prominence through unsealed internal communications and research, with Meta CEO Mark Zuckerberg expected to testify. For plaintiffs like Julianna Arnold—whose daughter died after encountering fentanyl sellers on Instagram—the trial represents a search for transparency and accountability. 
The outcome could influence future legislation requiring safer product designs for young users.

Key Takeaways

  • Successful plaintiff arguments could weaken Section 230 by establishing that addictive design features constitute actionable conduct distinct from user-generated content, potentially changing how platforms serve content to minors
  • Internal Meta communications comparing Instagram to 'a drug' and referencing 'Reward Deficit Disorder' provide powerful evidence that companies understood their platforms' addictive nature, undermining 'neutral platform' defenses
  • The trial tests whether courts will hold platforms liable for harms attributed to design features like infinite scroll and autoplay, drawing parallels to Big Tobacco cases where companies knew their products were addictive
  • High-profile testimony from Meta CEO Mark Zuckerberg could significantly influence jury perception, making his credibility a critical factor in the case outcome
  • Plaintiffs aim to shift legal focus from content to product design, seeking precedent that would require social media companies to prioritize user safety over engagement metrics

AG Campbell Files Lawsuit Against Meta, Instagram For Unfair And Deceptive Practices That Harm Young People | Mass.gov

Massachusetts Attorney General Andrea Joy Campbell announced a lawsuit against Meta Platforms Inc. and Instagram for violating consumer protection laws by designing addictive features that exploit young users' vulnerabilities. The bipartisan coalition of 42 attorneys general alleges Meta knowingly created and maintained features like infinite scroll, autoplay Stories/Reels, and intermittent variable rewards that hook young users into excessive platform use. Internal research showed Meta understood these features caused significant harm to teen mental health—including decreased happiness and increased depression and self-harm—but chose to hide this knowledge and mislead the public to protect profits. The lawsuit specifically cites Meta's failure to implement effective age verification for under-13 users despite knowing children were actively using Instagram, and its continued claims that the platform was safe for youth. The complaint argues these practices have contributed to Massachusetts' mental health crisis among teens, overloading school systems and healthcare resources. The action follows a 2021 nationwide investigation co-led by Massachusetts into Instagram's impact on young people, with 8 states filing similar state court complaints and 33 joining a federal lawsuit.

Key Takeaways

  • Meta's design choices intentionally exploited adolescent psychology through slot-machine-like reward systems and FOMO triggers, creating uncontrollable usage patterns
  • Internal documents prove Meta concealed knowledge of platform harms to teen mental health while publicly claiming safety priorities
  • The lawsuit highlights systemic failure in age verification, allowing under-13 users onto a platform Meta knew was most damaging to the youngest users
  • These practices have created measurable public health costs for Massachusetts, straining schools and healthcare systems to address youth mental health crises
  • The coordinated multistate legal strategy represents unprecedented attorney general action against Big Tech's youth-targeting business models

Instagram and YouTube owners built 'addiction machines', trial told

The landmark trial in Los Angeles examines whether Instagram and YouTube created "addiction machines" targeting children, with Meta and Google facing allegations of deliberate platform design causing mental health harm. Plaintiff KGM's lawyers presented evidence including a 2015 email from Mark Zuckerberg demanding a 12% increase in "time spent" to meet business goals, and accused YouTube of using "digital babysitting service" tactics to exploit busy parents. Meta's defense argued KGM's mental health struggles stemmed from family abuse and neglect rather than platform use. The case could set precedents for similar lawsuits nationwide, with state attorneys general pushing for regulatory changes including age restrictions, disabling "addictive" features like infinite scroll, and deleting data from under-13 users. The trial features testimony from Zuckerberg, Instagram head Adam Mosseri, YouTube CEO Neal Mohan, and former Meta whistleblowers. Snap and TikTok settled with KGM before trial, leaving Meta and Google as primary defendants.

Key Takeaways

  • Platform design choices are being scrutinized as potential public health threats to children
  • The trial could establish legal precedents affecting how social media companies design youth-targeted features
  • State attorneys general are pushing for sweeping regulatory changes including disabling 'addictive' features
  • Evidence presented includes internal corporate communications showing business goals prioritized over user wellbeing
  • Whistleblower testimony and expert analysis will be central to determining platform liability

SUPERIOR COURT OF CALIFORNIA, COUNTY OF LOS ANGELES

This court ruling addresses multiple motions for summary judgment filed by Meta, Google, ByteDance, and Snap Inc. in a lawsuit brought by K.G.M. and others against major social media platforms. The lawsuit alleges negligence and negligent failure to warn related to the platforms' design features that allegedly contributed to the plaintiffs' mental health harms. The court denies all motions, finding that there are genuine issues of material fact that preclude summary judgment. Key findings include:

  • Meta's motion is denied because there are factual disputes about whether Meta's design features (like "infinite scroll") caused harm, and whether warnings should have been provided. The court rejects Meta's arguments that Section 230 of the Communications Decency Act or the First Amendment bar the claims.
  • Google's motion is denied because there are factual disputes about whether YouTube's design features (like "autoplay") caused harm, and whether warnings should have been provided. The court rejects Google's Section 230 and First Amendment arguments.
  • ByteDance's motion is denied because there are factual disputes about whether TikTok's design features (like "endless scrolling") caused harm, and whether warnings should have been provided. The court rejects ByteDance's Section 230 and First Amendment arguments.
  • Snap's motion is denied because there are factual disputes about whether Snapchat's design features caused harm, and whether warnings should have been provided. The court rejects Snap's Section 230 and First Amendment arguments.

The court emphasizes that causation is a factual issue for the jury to decide, and that the plaintiffs have presented sufficient evidence to create genuine issues of material fact regarding whether the defendants' design features caused harm and whether warnings should have been provided.

Key Takeaways

  • The court found genuine issues of material fact regarding whether social media design features caused plaintiffs' mental health harms, precluding summary judgment
  • The court rejected arguments that Section 230 of the Communications Decency Act or the First Amendment bar claims based on platform design features
  • The court emphasized that causation is a factual issue for the jury to decide, with plaintiffs presenting sufficient evidence to create disputes of material fact
  • The ruling allows claims against Meta, Google, ByteDance, and Snap Inc. to proceed to trial regarding negligence and negligent failure to warn
  • The decision highlights the evolving legal landscape around platform liability for design features that may contribute to user harm

YouTube Argues It Isn’t Social Media in Landmark Tech Addiction Trial - The New York Times

YouTube argued in a landmark social media addiction trial that it is not a social media platform but rather an entertainment service more akin to Netflix. Lawyers for the company claimed during opening statements that YouTube's video recommendation system is designed to help users find content they enjoy, not to create addictive behaviors. The case centers on a lawsuit filed by a 20-year-old California woman identified as K.G.M., who alleges that YouTube and Meta's Instagram created addictive apps that harmed her mental health through features comparable to slot machines. YouTube's defense emphasized that features like infinite scroll and video recommendations are user-friendly tools that can be disabled in settings, and presented evidence showing the plaintiff spent minimal time on potentially addictive features. The trial marks the first in a series of lawsuits against major tech companies testing whether social media platforms can be held liable for causing addiction comparable to traditional addictive substances. A win for the plaintiff could lead to significant damages and potential design changes for social media apps. The case highlights ongoing debates about tech company liability, algorithmic design, and the psychological impact of digital platforms on users, particularly youth.

Key Takeaways

  • YouTube's defense strategy hinges on repositioning itself as an entertainment platform rather than social media, arguing its features serve user preferences rather than create addiction

Social media giants face trial over claims they harm kids : NPR

Meta (owner of Instagram and Facebook) and Google's YouTube are heading to trial in California state court over allegations their platforms contributed to youth mental health crises through intentionally addictive design features. The case marks the first time such claims will be heard by a jury, with TikTok having settled confidentially on the eve of trial. The lawsuit, brought by teenager K.G.M. (now 19), alleges that infinite scroll, auto-play videos, frequent notifications, and recommendation algorithms created a "compulsion to engage" that led to depression, anxiety, and body dysmorphia. The trial will examine internal company documents, expert testimony, and the plaintiff's personal account of excessive social media use starting at age 10. Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri are expected to testify. The outcome could reshape social media design and set precedents for holding tech companies legally accountable for product features rather than user-generated content. Defendants argue no clinical diagnosis exists for social media addiction and cite First Amendment protections for content decisions. They also point to recent safety features like parental controls and time limits. The case navigates Section 230 immunity by focusing on platform design rather than specific posts. If successful, plaintiffs could secure damages and force design changes across the industry, potentially altering the internet landscape.

Key Takeaways

  • First jury trial testing whether 'addictive design' features can be legally liable under youth mental health claims
  • Outcomes could force fundamental changes to social media UX patterns like infinite scroll and auto-play
  • Case bypasses Section 230 by targeting platform mechanics rather than user content
  • Potential precedent-setting impact extends beyond Meta/YouTube to entire tech industry
  • Highlights growing legal pressure mirroring historical Big Tobacco campaigns

Social media 'addicting the brains of children,' plaintiff's lawyer argues in landmark trial

A landmark social media trial in Los Angeles is testing whether Meta and YouTube can be held responsible for harms to children, with the case of a 20-year-old plaintiff identified as KGM serving as a bellwether. The trial features dueling narratives: plaintiffs' attorneys argue Meta and Google have "engineered addiction in children's brains" through addictive features like infinite scroll, while company lawyers push back against the concept of social media addiction and emphasize user control and parental supervision. Plaintiffs' attorney Mark Lanier presented internal Meta documents and studies, including "Project Myst," which showed Meta knew children experiencing trauma were particularly vulnerable to addiction and that parental controls had limited impact. He compared social media companies to tobacco firms and casinos, citing internal communications where Meta employees described Instagram as "like a drug" and "pushers." Meta and YouTube attorneys countered that KGM is not addicted to their platforms, pointing to her sworn testimony and medical records that show no diagnosis of addiction. YouTube's attorney Luis Li highlighted that KGM's average daily watch time on YouTube Shorts was only 1 minute and 14 seconds, and that all features Lanier challenged could be disabled or modified by users. The outcome could have profound effects on how social media companies handle children on their platforms and may set precedents for thousands of similar lawsuits. Additional trials are scheduled across the country, including a New Mexico case accusing Meta of failing to protect young users from sexual exploitation and a federal bellwether trial in June representing school districts suing over child harms.

Key Takeaways

  • The trial presents a critical test of whether social media platforms can be held liable for designing features that may addict children, with implications for thousands of similar lawsuits nationwide.
  • Internal Meta documents reveal the company understood the vulnerability of traumatized children to addiction and the limited effectiveness of parental controls, raising ethical concerns about platform design choices.
  • YouTube and Meta defend themselves by emphasizing user control over features like infinite scroll and pointing to KGM's own testimony denying addiction, framing the issue as individual usage rather than platform design.
  • The case highlights broader legal and regulatory pressures on tech companies, with multiple trials and state attorney general lawsuits targeting social media's impact on youth mental health and safety.
  • Potential outcomes range from significant liability for the platforms to reinforced arguments about user autonomy, with the trial serving as a bellwether for how courts may approach similar claims.

'It's not an addiction': YouTube pushes back in landmark social media trial | Courthouse News Service

In a landmark trial, Google attorney Luis Li argued that YouTube is not a social media platform and therefore cannot contribute to social media addiction in children. The case involves plaintiff Kaley GM, now 20, who claims YouTube and other platforms worsened her depression and anxiety. Li contended that YouTube functions more like a streaming service (comparing it to Netflix) rather than a social network, and presented evidence showing Kaley averaged only 29 minutes of YouTube use per day from 2020-2024, with medical records containing just one mention of YouTube use for sleep aid. The trial is part of a larger consolidated action involving 1,600 plaintiffs suing Meta, Google, Snap, and TikTok. Plaintiff attorney Mark Lanier plans to show evidence that tech companies "built machines designed to addict the brains of children." Expert witness Dr. Anna Lembke testified that social media "druggifies" human connection, creating compulsive usage patterns that can become a new source of trauma. The case follows a 2024 U.S. surgeon general report warning of social media's "profound risk of harm" to adolescent mental health. Meta's attorney argued Kaley's mental health struggles stemmed from childhood abuse rather than platform use. The trial could influence global settlements and potential regulation of social media platforms.

Key Takeaways

  • YouTube's defense strategy hinges on distinguishing itself from interactive social platforms by framing itself as a content-streaming service rather than a social network
  • The case highlights the tension between platform design choices and potential mental health impacts, with experts drawing parallels between social media use and substance addiction
  • A key legal question emerges: whether platform classification (social vs. streaming) affects liability for alleged addictive design features
  • The trial could set precedent for regulating how platforms engage young users, potentially influencing warning labels and design restrictions
  • Medical records showing minimal YouTube usage challenge addiction claims, suggesting other factors may underlie plaintiff's mental health struggles

Meta, YouTube face trial over allegations their tech is addictive, as TikTok settles - CBS News

Meta and YouTube face trial this week over allegations their platforms are deliberately designed to be addictive and to harm children, after TikTok settled its part of the case. The lawsuit, brought by a 19-year-old plaintiff identified as KGM, claims social media use from a young age caused depression and suicidal thoughts due to intentional design choices aimed at boosting engagement and profits. The case could serve as a bellwether for over a thousand similar lawsuits against social media companies. If successful, it could bypass tech companies' First Amendment protections and Section 230 liability shield by proving deliberate harm through product design. The trial, set for Los Angeles County Superior Court, will last six to eight weeks, with Meta CEO Mark Zuckerberg expected to testify. Legal experts compare the case to the Big Tobacco trials that led to massive settlements. Meta and YouTube deny the allegations, citing safeguards and arguing mental health is multifaceted. The case is part of a wave of litigation, including school district lawsuits and state attorney general actions against Meta for allegedly harming youth mental health through addictive features on Instagram and Facebook. TikTok faces similar lawsuits in multiple states.

Key Takeaways

  • This trial could establish precedent for over 1,000 similar cases against social media companies, potentially forcing platform redesigns
  • If successful, plaintiffs could bypass Section 230 protections by proving deliberate harm through product design rather than content liability
  • The case mirrors Big Tobacco litigation strategies, with potential for massive settlements if companies are found liable for intentional design choices
  • Meta and YouTube defend themselves by citing safeguards and arguing mental health is complex and multifactorial
  • Over 40 state attorneys general have filed similar lawsuits against Meta for allegedly designing addictive features targeting children

Meta sued by states for harming young people's mental health | AP News

Dozens of US states, including California and New York, are suing Meta Platforms Inc. for knowingly designing features on Instagram and Facebook that addict children to its platforms, contributing to the youth mental health crisis. The lawsuit, filed by 33 states in federal court in California with additional state actions bringing the total to 41 states plus Washington D.C., alleges Meta routinely collects data on children under 13 without parental consent in violation of federal law. The complaint claims Meta has misled the public about dangers while exploiting vulnerable teens for profit. Specific harms cited include Meta's internal research showing Instagram worsens suicidal thoughts in 13.5% of teen girls and eating disorders in 17% of teen girls. The suits seek financial damages, restitution, and an end to Meta's unlawful practices. Meta responded by stating it shares attorneys general's commitment to teen safety and has implemented over 30 tools to support teens and families, expressing disappointment with the legal approach instead of industry collaboration. The lawsuit follows previous revelations from Meta's own research and whistleblower documents that exposed the company's awareness of platform harms to youth mental health and body image issues.

Key Takeaways

  • Meta faces legal action from 41 states and DC over alleged design choices that addict children to Instagram and Facebook, contributing to youth mental health crisis
  • Lawsuit alleges Meta violates Children's Online Privacy Protection Act by collecting data on under-13 users without parental consent
  • Internal Meta research cited in lawsuit shows platform harms including worsened suicidal thoughts and eating disorders among teen girls
  • Attorneys general argue Meta prioritizes profit over people by designing manipulative features that lower children's self-esteem
  • Meta claims it has implemented over 30 teen safety tools but expresses disappointment with legal approach instead of industry collaboration

Social media companies face landmark trial over youth addiction claims | AP News

TikTok has agreed to settle a landmark social media addiction lawsuit just before trial began, joining Snap Inc., which settled last week. The remaining case, against Meta's Instagram and Google's YouTube, alleges the platforms deliberately designed features to addict children and harm their mental health. The lawsuit, centered on plaintiff KGM (19), claims social media platforms used behavioral techniques similar to slot machines to maximize youth engagement and advertising revenue. If successful, the case could bypass tech companies' First Amendment protections and Section 230 liability shield. Jury selection for the Los Angeles trial begins this week, with Meta CEO Mark Zuckerberg expected to testify. The outcome could impact how social media companies handle youth users and influence hundreds of similar lawsuits nationwide. Meta and Google deny the allegations, citing safeguards and arguing mental health is multifaceted. The case mirrors the Big Tobacco trials and precedes federal bellwether trials for school districts and state attorney general lawsuits against Meta.

Key Takeaways

  • TikTok's pre-trial settlement avoids immediate legal battle but remains a defendant in other personal injury cases
  • The case tests whether social media design choices that addict children can bypass Section 230 liability protections
  • Outcome could set precedent for hundreds of similar lawsuits against Big Tech companies
  • Trial mirrors Big Tobacco litigation strategies with potential for massive financial and regulatory consequences
  • Meta and Google maintain their platforms protect youth, arguing mental health issues are complex and multifactorial

Meta accused in New Mexico trial of failing to protect children online | AP News

A New Mexico lawsuit has accused Meta Platforms Inc., parent company of Facebook and Instagram, of failing to protect children from sexual exploitation online. The trial, the first stand-alone case brought by state prosecutors against major social media companies regarding child safety, began with prosecution attorneys alleging Meta misrepresented platform safety while prioritizing profits over user protection. Prosecutors claim Meta knew its platforms enabled predators to target children but concealed this information from users and regulators. Evidence presented includes internal communications showing Meta executives understood the risks of underage users being exposed to harmful content. The state argues Meta's algorithms are engineered to maximize engagement at the expense of child safety, with prosecutors citing data showing approximately 500,000 inappropriate interactions with children occur daily on Meta platforms. Meta's defense contends it has implemented extensive safety measures and properly disclosed platform risks to users. The case could have significant implications for Section 230 of the Communications Decency Act, which grants tech companies liability protection for user-generated content. New Mexico Attorney General Raúl Torrez, leading the lawsuit, seeks stronger age verification systems, algorithm changes to reduce exposure to harmful material, and increased monitoring of encrypted communications. Meta has criticized the investigation as 'ethically compromised,' arguing prosecutors misused child proxy accounts and disposed of critical evidence.

Key Takeaways

  • This marks the first state-level trial against Meta regarding child safety, potentially setting precedents for future litigation against social media companies
  • Prosecutors allege Meta concealed knowledge of widespread child exploitation while prioritizing engagement metrics and profits
  • The case challenges Section 230 protections by arguing platforms should be held accountable for algorithmic design choices that enable harm
  • New Mexico seeks systemic changes including enhanced age verification, algorithm modifications, and monitoring of encrypted communications
  • Meta defends its safety measures while criticizing the state's investigation methods as ethically flawed

Google, Meta push back on addiction claims in landmark social media trial - The Business Journal

Jurors in a landmark Los Angeles trial are examining claims that Meta (owner of Instagram) and Google (owner of YouTube) are responsible for harms to children through addictive social media features. The case centers on a 20-year-old plaintiff identified as KGM, whose experience could shape thousands of similar lawsuits. Lawyers for the plaintiffs compare platforms to casinos and drugs, presenting internal company documents that allegedly show awareness of addiction risks, including Meta's "Project Myst" study and internal communications likening products to "pushers." Meta and Google defend themselves by citing scientific disagreement over social media addiction, pointing to KGM's limited daily usage (29 minutes average, with just 1:14 on YouTube Shorts), and emphasizing that all features can be modified or disabled. Meta's attorney argues that KGM's mental health struggles stemmed from childhood trauma, abuse, and relationships rather than platform use. The trial, expected to last six to eight weeks, includes testimony from executives like Meta CEO Mark Zuckerberg and could significantly impact how tech companies design and moderate content for young users. Similar cases are proceeding in New Mexico and Oakland, with additional lawsuits from state attorneys general across the U.S. against Meta and TikTok.

Key Takeaways

  • The trial hinges on whether platforms were a 'substantial factor' in KGM's mental health struggles, with Meta arguing her issues originated from childhood trauma rather than social media use.
  • Internal company documents presented as evidence suggest awareness of addiction risks among teens, including Meta's Project Myst study and comparisons of products to casino mechanics.
  • YouTube and Google emphasize that KGM's actual usage was minimal (29 minutes/day average, 1:14 on Shorts) and that all features can be modified or disabled to align with user preferences.
  • The outcome could set precedents for thousands of similar lawsuits against tech companies, influencing future design and moderation practices for youth users.
  • Multiple parallel cases are underway, including New Mexico trials on sexual exploitation and Oakland federal cases representing school districts, signaling a broader legal reckoning for social media companies.

The Social Media Addiction Trials: What to Know - The New York Times

This article covers landmark trials testing a new legal strategy claiming Meta, TikTok, Snap, and YouTube caused personal injury through addictive products. The lawsuits argue that social media features like infinite scrolling, algorithmic recommendations, and autoplay videos lead to compulsive use and harm young users, causing issues such as depression, anxiety, eating disorders, and self-harm. The first trial, ongoing in Los Angeles, involves a 20-year-old woman (K.G.M.) who claims addiction to YouTube, Instagram, TikTok, and Snapchat from ages 8 to 11 resulted in mental health problems. Her lawyers compare the platforms to "digital casinos" profiting off addictive behavior. Nine cases are bundled as bellwethers in Los Angeles, with federal cases in Oakland focusing on public nuisance claims from school districts and states. A New Mexico trial addresses Meta's alleged facilitation of child exploitation. The companies defend themselves by citing lack of scientific proof linking tech use to addiction and invoking Section 230 of the Communications Decency Act, which shields them from liability for user-generated content. Meta argues K.G.M.'s mental health issues stemmed from familial abuse, while YouTube claims it is not a social media company and its features aren't designed to be addictive. The outcome could force design changes and open the door to millions of claims, potentially impacting the companies' business models.

Key Takeaways

  • The trials represent a novel legal strategy comparing social media to Big Tobacco, arguing platforms are defectively designed to be addictive and cause personal injury.
  • A win for plaintiffs could trigger an avalanche of similar claims and force costly design changes to social media platforms, potentially disrupting their business models.
  • The cases hinge on proving a scientific link between social media use and addiction, with companies defending themselves using Section 230 immunity and arguing no clear causation exists.
  • K.G.M.'s case as a bellwether trial could set precedents for thousands of similar lawsuits filed by individuals, school districts, and state attorneys general.
  • The outcome may reshape how social media platforms design and operate, particularly regarding features impacting young users and potential public health consequences.

What legal experts say about a major 'bellwether trial' over child social media addiction | PBS News

A landmark legal case is underway in Los Angeles Superior Court, testing whether major social media companies can be held liable for designing products that allegedly addict children. The case, known as JCCP 5255, centers on plaintiff K.G.M., a California woman who claims she became addicted to platforms like YouTube, Instagram, Snapchat, and TikTok starting at age six, leading to severe mental health issues including depression, anxiety, body dysmorphia, and self-harm. Jury selection began January 27, 2026, with opening arguments expected soon. This "bellwether trial" represents thousands of plaintiffs accusing Meta (Facebook/Instagram), YouTube (Google), Snap (Snapchat), and ByteDance (TikTok) of intentionally embedding addictive design features—such as endless scrolling and algorithmic recommendations—to maximize youth engagement and ad revenue. Snap and TikTok have already settled, leaving Meta and YouTube as defendants. The lawsuit argues these features mirror behavioral techniques used by slot machines and the cigarette industry, targeting minors as a core market to ensure future adult users. Legal experts note the case hinges on proving causation between platform design and addiction, with social media companies disputing the very existence of "social media addiction" as a clinical condition and arguing that any harm stems from user-generated content protected under the First Amendment and Section 230 of the Communications Decency Act. Potential outcomes include a verdict against the defendants, further settlements, or a defense victory—all of which could shape future litigation and policy. The trial is being watched closely as it may influence how courts treat Section 230, particularly whether addictive product design can be considered distinct from protected content moderation. The case arrives amid growing global regulatory pressure, including recent bans on under-16 social media use in Australia and proposed similar laws in Denmark and France.

Key Takeaways

  • The trial tests whether social media companies can be held liable for product designs that allegedly addict children, potentially reshaping legal interpretations of Section 230 and the First Amendment.
  • A verdict could establish precedent for thousands of similar claims, influencing future litigation strategies and regulatory approaches to youth mental health risks.
  • Global trends show increasing government intervention, with Australia banning under-16 social media use and Denmark/France considering similar laws, signaling a shift toward stricter child protection measures.

Landmark trial accusing tech giants of harming children with addictive social media begins | PBS News

The landmark trial beginning in Los Angeles County Superior Court marks the first time major social media companies face jury trials over allegations they deliberately designed platforms to addict and harm children. Meta (parent company of Instagram) and Google (owner of YouTube) are accused of engineering features that maximize youth engagement to boost advertising revenue, with internal documents cited showing explicit targeting of young children. The case centers on a 19-year-old plaintiff (KGM) who claims social media addiction caused severe mental health issues, including depression and suicidal thoughts. Lawyers for the plaintiffs compared the situation to Big Tobacco trials, arguing that design choices like 'like' buttons cater to minors' need for social validation. The trial, expected to last six to eight weeks, could set precedents for hundreds of similar lawsuits nationwide. TikTok and Snap settled earlier. The outcome may test whether companies can invoke Section 230 protections against liability for user-generated content. Executives including Meta CEO Mark Zuckerberg are expected to testify. Similar trials are underway in New Mexico (alleging failure to protect against sexual exploitation) and Oakland (school districts suing over child harms). The case comes amid growing global regulatory pressure, with countries like France, Australia, and Britain considering or implementing age restrictions on social media use.

Key Takeaways

  • The trial represents a pivotal legal test of whether social media companies can be held liable for design choices that allegedly addict children, potentially bypassing Section 230 protections.

Why are Meta, YouTube on trial? Explaining the youth addiction case.

A landmark trial has begun in Los Angeles Superior Court against Meta and Alphabet (parent company of YouTube), alleging their social media platforms—including Instagram and YouTube—are intentionally designed to addict children, causing severe mental health harm. The case centers on plaintiff Kaley G.M., a 20-year-old woman who claims she became addicted to these apps during her teenage years due to their attention-grabbing design, leading to depression and other mental health issues. Her lawsuit argues that Meta and Google failed to warn users about the risks of their platforms and that their algorithms were a substantial factor in her injuries. Opening statements began February 9, 2026, with Meta’s attorney citing Kaley’s pre-existing mental health history, while plaintiff counsel Mark Lanier plans to present internal company documents allegedly showing intentional design choices to maximize user engagement. Mark Zuckerberg is expected to testify, and the trial may extend into March. If found liable, the jury could award damages for pain and suffering and impose punitive damages. This case is one of three “bellwether” trials selected from hundreds of similar lawsuits against major social media companies. Snapchat and TikTok, initially named in the suit, settled before trial. Legal experts warn that a verdict favoring the plaintiff could trigger a wave of further litigation against tech giants. The outcome will test whether companies can be held legally responsible for alleged harm caused by their product design rather than the content users consume.

Key Takeaways

  • This trial marks the first time Meta and Alphabet will face direct courtroom scrutiny over claims their platforms are intentionally designed to addict youth, setting a potential precedent for future liability.
  • The case hinges on whether internal company documents prove deliberate design choices to maximize engagement at the expense of child mental health—a claim plaintiff attorneys plan to emphasize.
  • A favorable verdict for the plaintiff could unleash a flood of similar lawsuits against major social media companies, fundamentally altering the legal landscape for tech accountability.
  • The trial highlights growing regulatory and public pressure on tech firms to balance engagement-driven algorithms with safeguards for vulnerable user groups, particularly minors.

Meta and YouTube head to trial to defend against youth addiction, mental health harm claims | CNN Business

This article covers the trial in Los Angeles where Meta and YouTube face legal action over claims their platforms caused mental health harm to a teenage girl identified as KGM. The lawsuit alleges the companies intentionally designed addictive features that led to self-harm and suicidal thoughts. TikTok and Snap, originally named in the suit, settled before trial, while Meta and YouTube will defend themselves before a jury. The case is part of a larger consolidated litigation involving over 1,000 similar personal injury claims against these platforms. The trial will test whether design features like endless scrolling and algorithmic recommendations—not just user content—contribute to mental health harms. Executives from Meta, TikTok, and YouTube are expected to testify. The companies have implemented various youth safety measures, including parental controls and content restrictions, but critics argue these measures remain insufficient. The outcome could influence how thousands of similar cases are resolved and may lead to new regulations or platform changes.

Key Takeaways

  • The trial marks the first time social media executives will testify before a jury about claims their platforms caused youth mental health harm through addictive design features.
  • Over 1,000 similar personal injury cases are consolidated in multi-district litigation, with this case serving as a bellwether for potential settlements or rulings.
  • Tech companies defend themselves by citing safety features and parental controls, while critics argue these measures don't fully address design-driven harms like endless scrolling and algorithmic recommendations.
  • The case could set precedent for future regulations, potentially forcing platforms to overhaul addictive design elements or face significant financial liability.
  • The trial highlights growing pressure on tech giants to address youth mental health concerns, mirroring scrutiny faced by tobacco companies in past public health lawsuits.

Social Media Use and Health and Well-being of Lesbian, Gay, Bisexual, Transgender, and Queer Youth: Systematic Review - PMC

This systematic review examines how LGBTQ youth (aged 10-24) use social media for peer connection, identity development, and social support, and how these uses impact mental health and well-being. The review analyzed 26 peer-reviewed studies (15 qualitative, 8 quantitative, 3 mixed methods) published from 2012 onward. Key findings show that social media provides safe spaces for LGBTQ youth to connect with peers, explore and manage identities, and access support networks. Platforms like Instagram, Tumblr, and Twitter were commonly used for anonymous connection and identity expression. While social media use was associated with reduced mental health concerns (anxiety, depression, paranoia) and increased well-being for many youth, negative experiences such as cyberbullying and platform toxicity were also reported. The review highlights the importance of social media as a resource for LGBTQ youth, particularly those in rural or unsupportive environments, but notes limitations in the evidence base and calls for more robust longitudinal studies to understand causal relationships.

Key Takeaways

  • Social media provides safe spaces for LGBTQ youth to connect with peers, explore identities, and access support networks, counteracting heteronormative environments
  • Platforms like Instagram, Tumblr, and Twitter are commonly used for anonymous connection and identity expression among LGBTQ youth
  • Social media use is associated with reduced mental health concerns (anxiety, depression) and increased well-being for many LGBTQ youth, though negative experiences like cyberbullying also occur
  • LGBTQ youth actively manage identity disclosure on social media through strategies like anonymity, privacy settings, and multiple accounts to avoid stigma and discrimination
  • The review identifies gaps in the evidence base and calls for more robust longitudinal studies to determine causal links between social media use and mental health outcomes

Explainer: Proposed Social Media Ban for Under-16s in Australia | Australian Human Rights Commission

The Australian Government is proposing laws requiring technology companies to restrict individuals under 16 from accessing social media platforms. While aimed at protecting children from online harms like cyberbullying, harmful content, and data exploitation, the Australian Human Rights Commission raises serious concerns about the proposal's human rights implications. The ban could limit fundamental rights including freedom of expression, access to information, association, education, cultural participation, health information access, and privacy. International human rights frameworks like the Convention on the Rights of the Child emphasize that any restrictions must be lawful, necessary, and proportionate, using the least restrictive means available. The Commission argues that blanket bans risk isolating young people, particularly marginalized groups, and may create privacy risks through mandatory age verification systems. Alternatives suggested include imposing a legal duty of care on social media companies and enhancing digital literacy education in schools to help children navigate online spaces safely. The Commission has developed a Child Rights Impact Assessment tool to evaluate how proposed laws affect children's rights and wellbeing.

Key Takeaways

  • A social media ban for under-16s in Australia raises significant human rights concerns regarding freedom of expression, access to information, association, education, cultural participation, health information, and privacy.

This country banned social media for young teens. Here’s how they’re defying it. - The Washington Post

Australia implemented history's first nationwide ban on social media for users under 16, taking effect overnight with broad parliamentary and public support. Prime Minister Anthony Albanese's center-left government passed the law in November 2024 after claiming extensive consultation, though it faced opposition from independent lawmakers and the Australian Greens. The legislation targets major platforms including YouTube, Twitch, TikTok and Instagram, requiring tech companies to take 'reasonable steps' to verify users' ages without explicit benchmarks. Despite official backing, compliance faces significant hurdles. Most teens interviewed by The Washington Post intend to circumvent the ban through various methods: creating new accounts with false ages, using parents' accounts, employing VPNs, or migrating to permitted apps like Roblox and Discord. Sixteen-year-old Mariska Adams and peers view the ban as an unreasonable restriction on their social connectivity, arguing it won't solve underlying issues. Parents express mixed reactions. Some, like Dany Elachi of the Heads Up Alliance, welcome the ban as a tool to foster real-world interactions and expand social circles. Others, such as Amanda Oliver, acknowledge the challenges of transitioning their pre-teens to offline activities. A significant minority, including Melissa Di Vita, oppose the ban as government overreach and plan to help their children bypass it, fearing increased secrecy if monitoring is strict. The law raises privacy concerns as it requires additional age verification data from users, prompting adult Australians to discuss potential surveillance risks. Digital rights groups criticize the approach as possibly violating rights to expression, privacy and education. Rural and vulnerable groups, such as boarding school students and LGBTQ+ teens, may lose vital social connections during isolation periods. Tech giants Meta, TikTok and Snap have stated they'll comply but disagree with the policy. Verification methods will likely combine account registration data with activity patterns, though workarounds remain plausible. The experiment serves as a global test case for balancing youth protection with digital rights and parental autonomy.

Key Takeaways

  • The ban faces near-universal teen resistance, with most planning circumvention strategies ranging from false identities to parental account sharing
  • While supported by most Australians and lawmakers, the law triggers privacy concerns about increased data collection and potential government surveillance
  • Parents are divided: some see it as enabling real-world interactions, while others view it as overreach that may push teens to more secretive online behavior
  • Implementation challenges include undefined verification standards, potential migration to unregulated platforms, and special considerations for rural and vulnerable youth
  • The policy serves as a global precedent that will test the balance between protecting children online and preserving digital rights and parental autonomy

Social media addiction lawsuits head to trial - The Washington Post

Parents are taking legal action against Meta, TikTok, and YouTube claiming social media design features are addictive and have caused severe mental health issues in teens. The first high-profile case going to trial in Los Angeles focuses on a 19-year-old plaintiff who alleges anxiety, depression, self-harm, and suicidality resulted from childhood social media use across multiple platforms. These lawsuits, consolidated in California federal court, argue tech companies designed algorithms to maximize teen engagement through features like infinite scroll, autoplay, notifications, and like counts while downplaying known harms. Tech companies deny responsibility, stating they provide parental controls and youth safety tools, and claim user-generated content—not platform design—is the real issue. The cases echo 1990s tobacco lawsuits and could force platform design changes if successful. Experts note scientific literature shows correlation—not causation—between social media use and mental health issues, with some studies suggesting moderate use may actually benefit teens. Internal company research cited in filings, including Meta's halted Project Mercury study, reportedly shows reduced depression in users who stopped Facebook use. The trial will test whether "addiction" can be legally proven and whether platforms can be held liable for psychological harm from their product designs.

Key Takeaways

  • The lawsuits challenge whether social media platforms can be held legally responsible for teen mental health issues through product design features that maximize engagement
  • Unlike tobacco cases, proving causation between platform design and addiction faces scientific and legal hurdles—current research shows correlation but not definitive causation
  • If successful, verdicts could force major design changes to algorithms and features like infinite scroll, potentially reshaping the entire social media industry
  • The cases highlight internal company research that reportedly shows reduced mental health symptoms when teens stop using platforms, though companies claim these findings were misinterpreted
  • Experts emphasize the complexity of teen mental health—some studies suggest moderate social media use may actually benefit mental wellbeing, contradicting simple 'addiction' narratives

Complaint Against Meta Platforms for Exploiting Young Users

This legal complaint filed by multiple state attorneys general accuses Meta Platforms (including Instagram, Facebook, and related subsidiaries) of systematically exploiting young users for profit through psychologically manipulative design features. The complaint details how Meta's business model prioritizes maximizing user engagement—particularly among teenagers and children—through addictive algorithms, endless scrolling, push notifications, and other features that exploit developing brains. Meta is accused of falsely representing that its platforms are safe for young users while internally acknowledging the severe mental and physical harms caused by prolonged use. The complaint highlights Meta's violation of the Children's Online Privacy Protection Act (COPPA) by collecting personal data from users under 13 without parental consent. It also documents Meta's awareness of the links between its platforms and increased depression, anxiety, sleep disturbances, and suicidal ideation among young users, citing internal research and external studies. The lawsuit seeks injunctive relief to stop Meta's allegedly unlawful practices and demands accountability for the widespread damage to youth mental health.

Key Takeaways

  • Meta's business model intentionally targets young users, exploiting their psychological vulnerabilities to maximize engagement and ad revenue.
  • The complaint reveals Meta's internal research confirming that its platforms contribute to depression, anxiety, sleep problems, and suicidal ideation among teens.
  • Meta violates COPPA by collecting personal data from children under 13 without parental consent, despite marketing its platforms to this age group.
  • The lawsuit highlights Meta's deceptive practices, including misleading public statements about platform safety and the prevalence of harmful content.
  • Design features like infinite scroll, push notifications, and recommendation algorithms are engineered to be addictive and are known to interfere with education, sleep, and mental health.

Meta Sued Over Features That Hook Children to Instagram, Facebook - The New York Times

Meta Platforms Inc. faces legal action from over 40 U.S. states over allegations that Instagram and Facebook deliberately used psychologically manipulative features to addict children and teenagers to its platforms. The coordinated lawsuit led by Colorado and California accuses Meta of violating consumer protection laws by designing features like "infinite scroll" and persistent alerts to compel extended use among young users, while also unlawfully collecting personal data from children under 13 without parental consent. The states argue these design choices prioritized profit over public health, comparing Meta's practices to historical corporate malfeasance seen in Big Tobacco and Big Pharma cases. The complaint details how Meta's algorithms push users into harmful content loops and seeks both financial penalties and injunctive relief to force changes in platform design. Meta defended its actions, stating it has implemented over 30 tools to support teen safety and expressing disappointment that attorneys general pursued litigation rather than collaborative industry standards. The case reflects growing global regulatory pressure on social media companies regarding youth safety, with similar actions underway against TikTok. Investigations began after Facebook paused development of "Instagram Kids" in 2021 amid backlash, intensified by leaked internal research showing awareness of platform harms to teen mental health.

Key Takeaways

  • The lawsuit represents unprecedented state-level coordination against a tech company, signaling heightened regulatory scrutiny of youth-targeted platform design

Snap Settles Lawsuit on Social Media Addiction, Avoiding a Landmark Trial - The New York Times

Snap has settled a landmark social media addiction lawsuit just before a scheduled trial, avoiding what would have been the first courtroom test of whether tech companies can be held liable for designing products that allegedly addict young users. The case, brought by teenager K.G.M., claimed that Snap's features like infinite scroll and algorithmic recommendations led to compulsive use and mental health issues including depression and self-harm. While Snap's settlement terms remain undisclosed, the case represents the first of several similar lawsuits against major platforms including Meta, TikTok, and YouTube that are set to go to trial this year. Plaintiffs are using a legal strategy modeled after successful Big Tobacco litigation, arguing that social media platforms are inherently defective products that caused personal injuries to millions of young users. They plan to introduce internal executive communications showing awareness of teen mental health impacts while allegedly taking minimal action to curb excessive use. The lawsuits seek both financial damages and design changes to platforms, with states and school districts also claiming costs for mental health services. Snap, TikTok, Meta, and YouTube have all denied creating addictive products and plan to argue no scientific link exists between social media use and addiction, while also claiming First Amendment protections for their platforms.

Key Takeaways

  • The Snap settlement represents the first major resolution in a wave of 'social media addiction' lawsuits using Big Tobacco-style liability arguments against tech companies
  • These cases could establish new legal precedents for holding tech platforms accountable through personal injury liability rather than just consumer protection claims
  • Plaintiffs plan to introduce internal executive communications showing awareness of teen mental health impacts, potentially exposing corporate knowledge similar to tobacco industry documents
  • The lawsuits seek both financial compensation and mandatory design changes to features like infinite scroll and algorithmic recommendations that enable compulsive use
  • States and school districts are claiming costs for mental health services as direct damages from social media addiction, expanding the scope of potential liability

Social Media Giants Face Landmark Legal Tests on Child Safety - The New York Times

Social media giants Meta, TikTok, Snap, and YouTube face landmark legal tests starting this week, with a series of trials examining whether their platforms caused personal injury through addictive product designs. The cases, inspired by successful tobacco litigation strategies, accuse these companies of creating products that encouraged excessive use by millions of young Americans, leading to mental health issues like anxiety, depression, and body-image disorders. The first trial, involving a now-20-year-old Californian identified as K.G.M., begins jury selection in Los Angeles Superior Court on Tuesday. K.G.M. claims she became addicted to social media as a child and suffered lasting psychological harm. The lawsuits pose significant legal threats to the companies, potentially exposing them to new liabilities and massive damages. Plaintiffs' lawyers plan to argue that features like infinite scroll, auto video play, and algorithmic recommendations lead to compulsive use and mental health issues. They will present internal company documents showing executives knew about the harms but prioritized profits. Meta, TikTok, Snap, and YouTube have defended themselves by citing Section 230 of the Communications Decency Act, which shields them from liability for user-generated content, and argue there is no scientific proof that social media causes addiction. Snap and TikTok have already settled with K.G.M., but the broader cases continue. Nine total cases are set for trial in Los Angeles, with federal cases brought by school districts and attorneys general scheduled for summer in Oakland. These cases could set new standards for how social media companies are held accountable for the well-being of their users, particularly children. The outcome may influence global regulations, as the EU, UK, Australia, and others have already implemented restrictions on social media features for minors.

Key Takeaways

  • The trials represent a major legal challenge for social media companies, potentially exposing them to unprecedented liabilities and damages.
  • Plaintiffs will use internal company documents to argue that executives knew about the harms of their products but prioritized profits.
  • The cases could set new standards for holding tech companies accountable for user well-being, particularly regarding minors.
  • The outcome may influence global regulations as other countries have already implemented restrictions on social media features for children.

Meta and YouTube Created ‘Digital Casinos,’ Lawyers Argue in Landmark Trial - The New York Times

A landmark trial has begun in Los Angeles where a 20-year-old woman is suing Meta and YouTube, alleging their platforms are intentionally designed to be addictive and caused her personal injury. Lawyers for the plaintiff, identified only as K.G.M., argue that features like endless scrolling, autoplay videos, and algorithmic recommendations function like 'digital casinos' that trap users, particularly children. They presented internal company documents showing executives knew about these risks since at least 2011, including one YouTube presentation comparing the service to a babysitter for children under four and another document explicitly labeling products as 'slot machines' where 'the house always wins.' Meta's defense claims K.G.M.'s mental health issues stem from family abuse and turmoil rather than social media use. This case is part of a wave of lawsuits inspired by Big Tobacco litigation, with similar cases pending against Snap, TikTok, and others. The trial highlights growing global concern about social media's impact on youth: Australia has already banned under-16s from using these platforms, and the EU has implemented restrictions. The outcome could reshape how social media is designed and establish new liabilities for tech companies regarding user well-being.

Key Takeaways

  • This trial marks the first test of a legal theory comparing social media to casinos and tobacco, potentially setting precedents for future tech liability cases
  • Internal documents revealed show tech executives have known about addictive design features since at least 2011, contradicting public denials of harmful intent
  • The case highlights a global shift toward regulating social media for child safety, with Australia already implementing age bans and the EU passing restrictive laws
  • A victory for plaintiffs could trigger massive lawsuits and force fundamental changes to platform design, particularly around infinite scroll and algorithmic recommendations
  • The trial exposes a conflict between corporate priorities (growth and engagement) and user safety, especially for vulnerable young users

Opinion | We Didn’t Ask for This Internet - The New York Times

This New York Times Opinion podcast episode features a conversation between Ezra Klein, Cory Doctorow, and Tim Wu about the negative trajectory of the internet and potential solutions. The discussion centers on two key concepts: 'extraction' (Tim Wu's term for how platforms take value from users and businesses) and 'enshittification' (Cory Doctorow's term for how platforms degrade quality once users are locked in). The guests analyze how platforms like Facebook, Amazon, and others have evolved from providing valuable services to becoming extractive and manipulative systems that prioritize profit over user experience, small businesses, and fair labor practices. They discuss historical context, specific examples of platform degradation, and policy solutions including privacy rights, interoperability mandates, utility-style regulation, and breaking up monopolistic practices. The conversation also touches on the broader implications for democracy, inequality, and human attention, questioning whether competition alone can solve these problems or if deliberate policy interventions are needed to create a healthier digital ecosystem.

Key Takeaways

  • Platforms have shifted from providing value to users to extracting value from them through attention harvesting, data mining, and manipulative design, creating a system where user experience deteriorates once lock-in is achieved
  • The concept of 'enshittification' explains how platforms degrade quality over time by prioritizing business interests over user needs, often using algorithmic manipulation and surveillance to maintain engagement at all costs
  • Policy interventions like privacy rights, interoperability mandates, utility-style regulation, and antitrust enforcement are needed to counter extractive business models and create a healthier digital ecosystem that serves both users and businesses
  • The degradation of platform quality affects not just end users but also small businesses and workers, with platforms increasingly extracting value from sellers through fees and surveillance while exploiting labor through algorithmic management and lack of privacy protections
  • The conversation highlights the tension between market efficiency and human values, arguing that a purely competitive market approach fails to address the deeper societal harms of extractive platform practices and requires deliberate policy choices to protect user autonomy and democratic values

Utah to Require Parental Consent for Children to Use Social Media - WSJ

Utah Governor Spencer Cox signed two landmark bills regulating social media use by minors. Senate Bill 152 requires social media companies to verify users' ages before account creation; accounts for users under 18 require parental consent and give parents full access. House Bill 311 prohibits features that could cause social media addiction in minors and makes it easier for users to sue platforms. The laws will block minors from receiving messages from non-contacts, restrict their visibility in search, and impose a nightly 10:30 PM to 6:30 AM account lockout unless parents opt out. Ads will be prohibited on minors' accounts. Meta Platforms expressed support for teen safety while highlighting its existing parental control tools. The Electronic Frontier Foundation opposed the legislation, arguing it threatens user security and privacy. The bills were signed amid heightened national attention on social media's impact on youth mental health, with research showing nearly half of U.S. teens report being online "almost constantly." Federal lawmakers are also pursuing similar child protection measures nationwide.

Key Takeaways

  • Utah's new laws establish the first statewide framework requiring parental consent for minor social media accounts and implementing strict usage restrictions
  • The legislation addresses both account access (age verification) and usage patterns (nighttime blocks, limited contacts) to protect minors' mental health
  • Industry responses reveal tension between platform self-regulation and government intervention, with Meta supporting safety goals while civil liberties groups warn of privacy risks
  • The laws arrive during increased bipartisan focus on social media's effects on youth, with federal efforts also underway to protect children online
  • Research cited in the article highlights concerning trends in teen social media usage, with nearly half reporting constant online presence and many finding it difficult to disconnect

The Digital Wellness Lab’s Pulse Survey

This Pulse Survey from The Digital Wellness Lab examines how adolescents (ages 13-17) engage with media, their attitudes toward use, perceived effects on well-being, and online interactions. Key findings reveal that 94% of teens own smartphones, with average daily screen time of 8.2 hours. Most teens check their phones every 15 minutes or less and rarely go more than 12 hours without screens. YouTube, TikTok, Instagram, and Snapchat are the most used apps daily. While 32% report having no family rules about media use, those with rules focus more on content restrictions than time limits. Over half use screen time tracking apps, more common among girls. Teens report both positive and negative impacts of media use: social media enhances feelings of connection (79.4%) and support (69%), but also contributes to body image concerns (46%) and interferes with sleep (63.3%), family time (52%), and schoolwork (45%). Multitasking with devices is common (66%), and 54.6% feel it hurts attention. Physical symptoms like eye strain (47%), headaches (49.9%), back/neck pain (52.1%), and fatigue (57.1%) are frequently reported after media use. Teens primarily connect with known friends online (86% texting, 66% direct messaging), with 49.8% open to meeting online friends in person. Most are cautious with strangers online, avoiding sharing personal information, and value platform features like blocking and reporting for safety. Trust in online interactions is higher when introduced by known contacts or through verified accounts.

Key Takeaways

  • Teens are highly connected, with 94% owning smartphones and averaging 8.2 hours of daily screen time, often checking phones every 15 minutes or less.
  • While many families have no media use rules, those that do focus on content over time limits; girls are more likely to use screen time tracking apps than boys.
  • Social media positively impacts feelings of connection and support for most teens, but also contributes to body image issues and interferes with sleep, family time, and schoolwork.
  • Multitasking with devices is widespread and perceived to hurt attention and productivity; physical symptoms like eye strain, headaches, and fatigue are common after media use.
  • Teens are cautious online, primarily connecting with known friends, and value safety features like blocking and reporting; trust is built through known introductions or verified accounts.

Social Media and Youth Mental Health

This advisory from the U.S. Surgeon General examines the complex relationship between social media use and youth mental health. It highlights that while social media offers benefits like community connection and self-expression, there are significant risks including exposure to harmful content, cyberbullying, body image concerns, and disrupted sleep patterns. Key findings show that excessive use (over 3 hours daily) doubles the risk of depression and anxiety in adolescents. The advisory emphasizes that current evidence is insufficient to declare social media "safe" for children and adolescents, particularly given their vulnerable brain development during ages 10-19. It outlines specific harms: content exposure (suicide challenges, self-harm normalization, hate speech), problematic use patterns (addictive engagement features, sleep disruption), and predatory behaviors. The document calls for a multi-stakeholder approach: policymakers should establish age-appropriate safety standards and data privacy protections; technology companies must conduct independent impact assessments and prioritize safety in design; parents and caregivers need media literacy tools and family planning strategies; youth should develop healthy digital habits; and researchers must fill critical evidence gaps through longitudinal studies and standardized measures. The advisory advocates for applying a "safety-first" principle similar to consumer product regulations, requiring robust evidence of safety before widespread adoption among youth.

Key Takeaways

  • Social media use presents both benefits (community support, identity expression) and significant risks (mental health disorders, cyberbullying, sleep disruption) for adolescents, with effects varying by individual vulnerabilities and usage patterns
  • Current evidence is insufficient to declare social media 'safe' for children and adolescents, particularly given their ongoing brain development that makes them more susceptible to social pressures and reward-seeking behaviors
  • Key risks include exposure to harmful content (self-harm, hate speech, predatory behavior), social comparison leading to body image issues, and design features that encourage excessive use and disrupt healthy behaviors like sleep
  • A multi-stakeholder approach is needed: policymakers must create safety standards and data privacy protections; tech companies should prioritize safety in design and share impact data; parents need media literacy tools; youth should develop healthy digital habits; researchers must fill evidence gaps
  • The advisory advocates for a 'safety-first' approach similar to consumer product regulations, requiring robust evidence of safety before widespread adoption among youth, with independent assessments and transparent data sharing

Social Media Could Pose ‘Profound Risk of Harm’ to Young People’s Mental Health, Surgeon General Warns - WSJ

The U.S. surgeon general issued a public health advisory highlighting significant risks social media poses to adolescents' mental health. Dr. Vivek Murthy emphasized that while social media offers benefits like creative expression and community building, mounting evidence shows detrimental effects during critical brain development years (ages 10-19). Key findings include: 95% of teens use social platforms, with over a third using them "almost constantly." Research links excessive social media use (>3 hours/day) to doubled depression/anxiety risk in 12-15 year olds. Adverse effects examined include cyberbullying, exposure to self-harm content, sleep disruption, body image issues, and reduced physical activity. The advisory cites internal Meta research showing Instagram worsened body image concerns for one-third of teen girls. Recommendations target policymakers (stricter age verification, stronger data privacy), companies (transparent data sharing, rapid complaint response), and families (media literacy education). TikTok recently implemented 60-minute daily limits for under-18 users, while Meta has added privacy settings for teen accounts. Utah recently enacted laws requiring parental consent for under-18 social media accounts. The surgeon general stresses urgent action is needed as "our children don't have the luxury of waiting years" to understand full impacts.

Key Takeaways

  • Adolescent brain development makes teens particularly vulnerable to social media's mental health impacts during ages 10-19
  • Excessive use (>3 hours/day) correlates with doubled risk of depression/anxiety symptoms in early teens
  • Policy recommendations include mandatory age verification, enhanced data privacy, and company accountability measures
  • Platform-specific harms documented include cyberbullying, self-harm content exposure, sleep disruption, and body image issues
  • Urgent action is needed because effects may take years to fully manifest; delay risks irreversible harm

Schools Sue Social-Media Platforms Over Alleged Harms to Students - WSJ

Nearly 200 U.S. school districts have joined federal litigation against Meta Platforms (Facebook), TikTok, Snap (Snapchat), and Alphabet (YouTube), alleging their platforms cause classroom disciplinary problems and mental-health issues among students. The lawsuits, consolidated in U.S. District Court in Oakland, California, claim social media apps are addictive products that push destructive content to youth, diverting school resources from education. Districts report that teachers and administrators lose time responding to cyberbullying, implementing new policies, and counseling students experiencing anxiety, depression, or suicidal thoughts linked to social media use. The cases face a critical hurdle: the tech companies plan to move to dismiss them under Section 230 of the Communications Decency Act, which shields internet platforms from liability for third-party content. School districts and families counter that the companies created an addictive product rather than merely hosting user content, and thus fall outside the law's protection. The litigation strategy mirrors successful public nuisance lawsuits against vaping companies like Juul, which agreed to a $1.7 billion settlement after being accused of marketing addictive products to teens. Lawyers representing school districts have presented the cases at more than 100 board meetings, and many districts have joined to seek compensation for costs related to social media harms. Individual lawsuits by families, alleging defective product design and negligence, are also pending in the same court; these include suits by parents of students who suffered severe anorexia and by the family of a teenager who died by suicide after posting "Russian roulette" videos on Snapchat. Some legal scholars question whether schools can prove direct harm, but the cases highlight growing pressure on social media companies to address youth safety.

Key Takeaways

  • School districts argue social media platforms create an addictive product that pushes harmful content to youth, distinguishing it from third-party user content protected by Section 230.
  • The litigation strategy mirrors successful public nuisance lawsuits against vaping companies like Juul, which paid $1.7 billion to settle claims of marketing addictive products to teens.
  • Individual lawsuits by families allege defective product design and negligence, seeking liability even if school districts' public nuisance claims are dismissed under Section 230.
  • Legal scholars question whether schools can prove direct harm from social media, but the cases highlight growing pressure on tech companies to address youth safety and mental health impacts.
  • The lawsuits could set a precedent for holding social media companies accountable for product design and algorithmic features that may contribute to youth addiction and mental health crises.

Landmark Trial Tests Claims That Social Media Harms Teens - WSJ

A landmark trial in Los Angeles is testing whether social media platforms like Instagram, TikTok, and YouTube caused mental health disorders in teens through addictive product designs. The case centers on a young woman (K.G.M.) who alleges that social media use led to body dysmorphia, suicidal thoughts, anxiety, addiction, and depression. It is the first of thousands of lawsuits against Meta, TikTok, Snap, and YouTube in California and federal courts, with plaintiffs arguing that algorithmic recommendations, infinite scroll, and video autoplay make it difficult for teens to disengage. The companies deny liability, claiming they have invested in safety measures and that a victory for plaintiffs would undermine free expression. Key figures including Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri are expected to testify. The trial borrows a strategy from past cases against tobacco and pharmaceutical companies: plaintiffs will present internal documents they claim show the companies knew about their products' addictive nature and potential for harm. A major legal question is whether Section 230 immunity applies; jurors will decide whether the alleged harm came from third-party content or from platform design. The case could influence settlements elsewhere as both sides gauge the strengths and weaknesses of their arguments.

Key Takeaways

  • The trial marks the first test of whether social media platforms can be held liable for teen mental health harms through product design, using strategies from past tobacco and opioid cases
  • Plaintiffs will introduce internal company documents for the first time as evidence, potentially revealing what social media companies knew about addiction risks
  • A critical legal question centers on Section 230 immunity – jurors must decide if harm stems from user content or platform algorithms/design choices
  • The outcome could shape settlements in thousands of pending cases while public opinion increasingly favors restrictions on underage social media use
  • Companies argue their safety investments weigh against liability, but plaintiffs contend these measures are insufficient given the scale of the alleged harm

Mark Zuckerberg Testifies in California Social Media Addiction Trial - WSJ

Mark Zuckerberg testified in a landmark Los Angeles trial where Meta Platforms faces allegations that its platform features harm young users. The plaintiff, identified as K.G.M., claims social media use as a child led to severe mental health issues including body dysmorphia, suicidal thoughts, and depression. Zuckerberg admitted Meta no longer sets team goals for user time spent on its platforms, in contrast with a 2015 email in which he targeted a 12% increase in user engagement. He defended Meta's policies on underage users, stating that children under 13 are prohibited and removed when identified, though internal documents revealed an estimated 4 million underage users on Instagram in 2015, representing approximately 30% of U.S. children aged 10-12. A parallel case in New Mexico alleges Meta designed features that endangered children and failed to protect them from sexual exploitation. The trials center on Meta's business model of maximizing ad revenue through prolonged user engagement, with critics pointing to algorithmic recommendations, infinite scroll, and autoplay as mechanisms that make it difficult for teens to disengage. Meta reported record revenue of nearly $60 billion in its most recent quarter. Snap and TikTok settled with the plaintiff before the trial began; the remaining case is expected to last six weeks.

Key Takeaways

  • Meta's shift from explicit user engagement targets to more nuanced growth strategies reflects growing scrutiny over platform design impacts on youth mental health
  • The revelation of millions of underage users on Instagram highlights systemic gaps in age verification and enforcement despite stated policies
  • These trials represent the first major legal challenges testing whether platforms can be held liable for algorithmic design choices that potentially encourage addictive usage patterns
  • Meta's defense strategy appears to focus on contrasting clinical addiction with casual usage while emphasizing prior safety investments
  • The cases could set precedents affecting how all major social platforms approach youth safety, algorithmic transparency, and content moderation

Frequently Asked Questions

  • How does the plaintiffs' legal strategy of targeting platform design features rather than user-generated content specifically circumvent Section 230 of the Communications Decency Act — and what are the three possible outcomes if the jury accepts or rejects this distinction?
  • What specific internal documents — including 'Project Myst,' the 2015 Zuckerberg time-spent email, and YouTube's 'slot machine' language — are the plaintiffs relying on to prove *intentionality*, and how does Meta's defense counter that these were taken out of context?
  • How does the JCCP 5255 bellwether structure work mechanically: if KGM wins, does that verdict bind the other 3,000+ plaintiffs, or does it only influence settlement negotiations and subsequent trials?
  • YouTube's attorney Luis Li argued that KGM's five-year average watch time was only 29 minutes per day and that she herself denied being addicted to YouTube under oath — how does Dr. Anna Lembke's addiction medicine testimony counter these specific data points?
  • The Australian Human Rights Commission raised 'serious reservations' about the under-16 social media ban, citing rights to freedom of expression, education, and privacy — how do these human rights objections compare to the legal arguments being made in the U.S. litigation about First Amendment protections for platform design?
  • The Digital Wellness Lab's Pulse Survey found that 79.4% of teens feel 'connected' and 69% feel 'supported' through social media, while the LGBTQ youth systematic review found net positive mental health outcomes for marginalized youth — how do these findings interact with the Surgeon General's advisory warning that over 3 hours of daily use doubles depression and anxiety risk?
  • Meta's 'Project Mercury' study (referenced in the Washington Post article) reportedly found that users who stopped using Facebook for a month reported lower depression and anxiety — but Meta halted the research, claiming results were biased. How does this internal study compare to the external academic literature cited by defendants that shows only correlation, not causation, between social media use and mental health outcomes?
  • How do the three parallel litigation tracks — the KGM personal injury bellwether in LA Superior Court, the New Mexico AG trial focused on child sexual exploitation, and the upcoming Oakland federal trial representing school districts arguing public nuisance — differ in their legal theories, remedies sought, and potential precedential impact?
  • Utah's H.B. 311 explicitly prohibits social media companies from 'using a design or feature that causes a minor to have an addiction' — how does this statutory language compare to the legal standard the KGM plaintiffs must meet in tort law to prove defective product design, and which standard is harder to satisfy?
  • Given that Snap and TikTok both settled before trial for undisclosed sums, what strategic calculus explains why Meta and YouTube chose to go to trial rather than settle — and how might the outcome of the KGM bellwether affect their decision-making for the remaining 3,000+ cases?