Did Social Media Design Harm a Generation? Inside the California Trial Against Meta and Google

A California woman testifies that childhood use of Instagram and YouTube fueled anxiety, depression, and addiction-like behavior in a landmark trial against Meta and Google.

Testimony in a Los Angeles courtroom highlights growing global legal and regulatory pressure on social media giants over alleged harms to children and teens. Image: CH

Los Angeles, California, United States — February 27, 2026:

A California woman’s emotional courtroom testimony has placed renewed scrutiny on the design of social media platforms, as she accuses Meta Platforms’ Instagram and Google’s YouTube of contributing to years of anxiety, depression and compulsive use that began in childhood.

Known in court as Kaley G.M., the 20-year-old plaintiff told jurors in Los Angeles County Superior Court that she began using YouTube at age six and Instagram at nine. What started as casual engagement, she testified, evolved into an obsession that disrupted her sleep, schooling and in-person relationships. She described feeling unable to detach from her phone, even as online bullying and self-image pressures intensified. By age 10, she said, her mental health had deteriorated to suicidal thoughts and episodes of self-harm, though she never attempted suicide.

Her case forms part of a widening global backlash against social media companies over their alleged impact on children and teenagers. Australia has already banned under-16s from platforms such as Instagram and YouTube, while lawmakers in other countries are weighing similar restrictions. The trial unfolds against this broader regulatory push, amplifying its potential significance beyond a single plaintiff’s claims.

Kaley testified that when her mother confiscated her phone, she would experience intense anxiety and rage. Being offline, she said, triggered panic and a sense that “a huge part of me was missing.” Notably, she told jurors that the fear of missing out disturbed her more deeply than negative comments or harassment encountered online, underscoring the powerful emotional pull of constant connectivity.

The lawsuit, originally filed by her mother before she turned 18, alleges that Meta and Google knowingly engineered platform features to maximize engagement among young users despite internal awareness of potential mental health risks. Her attorneys cited an internal Meta study indicating that teenagers from difficult home environments are more likely to use Instagram habitually or unintentionally. They argue that features such as infinite scrolling, autoplay video, visible “like” counters and beauty filters were deliberately crafted to keep users hooked, often at the expense of self-esteem and psychological well-being.

Meta and Google deny wrongdoing and contend that scientific evidence does not support claims that their platforms substantially cause mental health disorders. Lawyers for YouTube argued that Kaley did not consistently use available safety tools, including options to delete comments or limit screen time. To prevail, her legal team must demonstrate that the platforms’ design or operation was a substantial factor in causing or worsening her mental health struggles.

The defense has also pointed to complex personal circumstances, including documented family conflict and abuse following her parents’ divorce, suggesting that multiple factors may have shaped her psychological trajectory. On the stand, Kaley acknowledged a complicated relationship with her mother but described her as both nurturing and loving, saying they remain close.

The trial, which began in late January, has also featured testimony from Kaley’s former psychotherapist, who linked excessive social media use to diagnoses of social phobia and body dysmorphic disorder during her early teens. It has drawn attention to what executives, including Meta CEO Mark Zuckerberg, knew about social media’s effects on children. Zuckerberg testified that while the company discussed youth-oriented products, it ultimately did not launch a platform specifically for children.

Beyond the personal narrative, the case could shape how courts interpret the responsibilities of technology firms in designing products used by minors. If jurors find that engagement-driven features materially contributed to harm, it may embolden further litigation and regulatory reform in the United States and abroad.

For now, the proceedings spotlight a fundamental question confronting the digital age: whether platforms built to capture attention can be held legally accountable when that attention becomes dependency.
