A landmark court case over alleged social media addiction begins Tuesday in California. Senior technology executives are expected to testify. The trial could reshape legal accountability for digital platforms.
The plaintiff is a 19-year-old woman known as KGM. She argues platform algorithms caused addiction and damaged her mental health. She says the designs encouraged compulsive use during her teenage years.
The defendants include Meta, owner of Instagram and Facebook, TikTok owner ByteDance, and YouTube parent Google. Snapchat reached a settlement with the plaintiff last week. The remaining companies now face trial.
The case will unfold at Los Angeles Superior Court. Observers see it as the first in a larger wave of similar lawsuits. These cases could weaken a long-standing legal defence for technology firms.
Algorithms and design choices take centre stage
The companies say the evidence fails to prove they are responsible for harms such as depression or eating disorders. They argue no direct link exists between their products and the alleged injuries.
The decision to proceed to trial signals a shift in the legal landscape. Courts increasingly examine claims that digital products promote addictive behaviour. Pressure on technology firms continues to build.
For decades, companies relied on Section 230 of the Communications Decency Act. Congress passed the law in 1996 to protect platforms from liability over user content.
This lawsuit targets different issues. It focuses on algorithms, notifications, and engagement features. These design choices influence how users interact with apps.
KGM’s lawyer, Matthew Bergman, called the trial a legal first. He said a jury would, for the first time, judge social media companies over their conduct.
He said many young people worldwide suffer similar harm. He accused platforms of prioritising profits over children’s wellbeing.
High stakes for the technology sector
Eric Goldman, a law professor at Santa Clara University, described the risks as severe. He warned that losses in court could threaten the companies’ survival.
He also highlighted legal challenges for plaintiffs. Courts rarely attribute physical or psychological harm to content publishers.
Still, he said these cases raise new legal questions. Existing laws never anticipated claims focused on digital product design.
Evidence, documents, and executive testimony
Jurors will hear extensive testimony during the trial. They will also review internal company documents.
Mary Graw Leary, a law professor at Catholic University of America, expects significant disclosures. She said companies may reveal information long hidden from public view.
Meta previously said it introduced dozens of safety tools for teenagers. Some researchers question whether those measures work.
The companies plan to argue third-party users caused the alleged harm. They deny their designs directly injured young users.
Meta chief executive Mark Zuckerberg will testify early in the trial. His appearance ranks among the most anticipated moments.
In 2024, Zuckerberg told US senators that scientific studies had established no causal link between social media use and worse youth mental health.
During that hearing, he apologised to victims and their families. Lawmakers pressed him during emotional exchanges.
Global scrutiny intensifies
Mary Anne Franks, a law professor at George Washington University, questioned the strategy of putting executives on the stand. She said technology leaders often perform poorly under pressure.
She added companies strongly hoped to avoid placing top executives on the stand. Public testimony carries major reputational risk.
The trial comes as scrutiny grows worldwide. Families, school districts, and prosecutors increasingly challenge social media practices.
Last year, dozens of US states sued Meta. They accused the company of misleading the public about platform risks.
Australia has banned social media use for children under 16. The UK signalled in January that it might follow.
Franks said society has reached a tipping point. She argued governments no longer treat the technology industry with automatic deference.
