NextFin

Adam Mosseri Asserts Social Media Not Clinically Addictive Amid Landmark Liability Trial

Summarized by NextFin AI
  • Instagram CEO Adam Mosseri testified that social media is not clinically addictive, defending Meta against claims of fostering compulsive behavior in minors.
  • The lawsuit involves a 20-year-old woman alleging Instagram's design features exacerbated her mental health issues, with internal documents suggesting Meta's awareness of its platform's impact.
  • The trial's outcome could have significant financial and regulatory implications for Meta and Google, potentially leading to industry-wide design changes and increased federal oversight.
  • Critics argue that current safety measures are superficial, and a loss for Meta could reshape the attention economy, impacting social media stocks.

NextFin News - In a high-stakes legal confrontation that could redefine the liability of Silicon Valley’s most powerful platforms, Instagram CEO Adam Mosseri testified on Wednesday, February 11, 2026, that social media is not “clinically addictive.” Appearing in Los Angeles Superior Court, Mosseri defended Meta Platforms Inc. against allegations that its products were intentionally designed to foster compulsive behavior in minors, leading to severe mental health crises. The testimony marks a critical juncture in the first of several “bellwether” trials drawn from hundreds of lawsuits filed by families, schools, and state attorneys general, who liken social media platforms to “digital casinos.”

The trial centers on a lawsuit filed by a 20-year-old California woman, identified as K.G.M., who alleges that Instagram and YouTube’s addictive design features—including infinite scrolling and algorithmic recommendations—exacerbated her depression and body dysmorphia. According to The Guardian, Mosseri argued on the witness stand that a distinction must be made between “clinical addiction,” which lacks an official diagnostic status for social media, and “problematic use.” During cross-examination, plaintiff’s attorney Mark Lanier presented internal Meta documents where employees reportedly referred to Instagram as a “drug” and themselves as “pushers,” suggesting that executives were aware of the platform’s biological and psychological grip on young users.

Mosseri’s defense strategy hinges on the lack of scientific consensus around social media addiction as a formal medical diagnosis. While the World Health Organization recognizes “gaming disorder,” social media addiction remains absent from the Diagnostic and Statistical Manual of Mental Disorders (DSM-5). By leaning into this clinical ambiguity, Mosseri seeks to shield Meta from the massive financial penalties that would follow if the court classified the platform as a defective or inherently dangerous product. According to Bloomberg, Mosseri acknowledged that some users exhibit compulsive usage patterns but compared the behavior to “binge-watching a Netflix show,” framing the issue as a matter of personal habit rather than corporate negligence.

The financial and regulatory implications of this trial are profound. If the jury finds Meta and Google liable, it could trigger a wave of settlements and court-ordered design changes across the industry. Currently, tech giants are navigating a tightening global regulatory environment; U.S. President Trump’s administration has maintained a complex stance on Big Tech, balancing a desire for deregulation with populist concerns over child safety and platform bias. The outcome in Los Angeles may provide the legal ammunition needed for more aggressive federal oversight or even the repeal of Section 230 protections, which currently shield platforms from liability for third-party content but are increasingly being challenged on the grounds of “product design.”

From an industry perspective, Mosseri’s testimony reflects a broader pivot toward “safety-by-design” as a defensive posture. In late 2024 and throughout 2025, Instagram rolled out “Teen Accounts” with muted notifications and restricted content by default. Critics, however, argue these are superficial fixes: a 2025 review by the nonprofit Fairplay found that nearly two-thirds of these safety tools were ineffective or non-functional. The tension between growth-driven algorithms and user well-being remains the core conflict. Internal emails revealed during the trial showed that Mosseri and his team hesitated to ban certain “plastic surgery” filters in 2019 out of fear that users would migrate to competitors like Snapchat or TikTok.

Looking forward, the “Big Tobacco moment” for social media appears to have arrived. Just as the tobacco industry once argued that nicotine was not addictive, the tech industry is now fighting a war of definitions. If the court accepts the plaintiff’s argument that “infinite scroll” and “variable rewards” (likes and notifications) are the digital equivalent of slot machine mechanics, the legal precedent will force a total overhaul of the attention economy. Investors are closely watching the proceedings, as a loss for Meta could lead to a significant re-rating of social media stocks, reflecting the new costs of compliance and the potential loss of user engagement as platforms are forced to become “less addictive” by law.

Explore more exclusive insights at nextfin.ai.
