Did Meta Bury Evidence That Facebook and Instagram Harm Users’ Mental Health?

Allegations that Meta suppressed internal research showing Facebook and Instagram may harm users intensify scrutiny as U.S. school districts pursue a major lawsuit.

Meta Accused of Suppressing Mental-Health Research
A lawsuit accuses Meta of suppressing internal research linking Facebook and Instagram to teen mental-health harms, fueling questions about the company’s safety practices and transparency. Image: CH



SAN FRANCISCO, United States — November 23, 2025:

Newly unsealed court filings have intensified scrutiny over Meta’s handling of internal mental-health research, describing claims that the company shut down a 2020 study after its findings suggested Facebook and Instagram could harm users’ well-being. This controversy sits at the heart of a sweeping lawsuit brought by school districts across the United States against several major social media companies, including Meta, Google, TikTok, and Snapchat.

The central allegation involves “Project Mercury,” a Meta research initiative conducted with Nielsen. The study examined what happened when users deactivated Facebook and Instagram for a week. Internal documents cited in the lawsuit show that participants reported lower levels of depression, anxiety, loneliness, and social comparison—results that provided rare causal evidence in a field where such clarity is uncommon. Rather than pursue or publish these results, Meta is accused of halting the project, with executives reportedly dismissing the findings as skewed by negative public narratives surrounding the company. Some employees pushed back internally, defending the study’s validity. One researcher said the study did reflect causal impacts on social comparison, while another drew comparisons to the tobacco industry’s suppression of research on harm.

Despite these internal insights, the filing argues that Meta later told Congress it could not determine whether its platforms harmed teenage girls, raising questions about transparency and accountability. Meta spokesperson Andy Stone has rejected the allegations, asserting that the study’s methodology was flawed and that the company has long invested in teen-safety measures. Stone maintained that Meta has consistently listened to parents, consulted experts, and implemented meaningful protections on its platforms.

The legal action, spearheaded by law firm Motley Rice, presents a broader critique of the tech industry’s treatment of child and teen users. It claims that multiple social platforms concealed risks, facilitated underage use, overlooked child sexual-abuse content, and encouraged teen engagement during school hours. While TikTok, Google, and Snapchat also face accusations, the filings provide far more detailed documentation regarding Meta. Among the most serious claims are that Meta’s safety features for young users were ineffective and rarely used; that internal enforcement thresholds for severe violations were set at extraordinarily high levels; that engagement-maximizing algorithms knowingly exposed teens to more harmful material; and that leadership deprioritized child safety while focusing on growth and the development of the metaverse.

The lawsuit also asserts that social platforms sought to influence child-focused organizations. It cites TikTok’s sponsorship of the National PTA as an example of corporate efforts to shape public-facing messaging. Though similar allegations appear against several companies, the case presents Meta as the most extensively documented and possibly the most central to the broader narrative of concealed risk.

Meta strongly disputes these characterizations. The company has moved to strike the internal documents referenced in the filing, arguing that the request to unseal them is overly broad and does not reflect the full context of its work on user safety. A key hearing on this issue is scheduled for January 26, 2026, in the Northern California District Court; its outcome will determine how much of Meta’s internal research enters the public record and will shape the trajectory of the case.

The controversy deepens ongoing debates about the responsibilities of tech companies in shaping the digital environments young people inhabit. If the allegations are corroborated, Meta may face not only legal consequences but also a renewed erosion of public trust. For now, the filings highlight a widening gap between the company’s internal findings and its public assurances, raising critical questions about platform accountability in the digital age.
