Meta executives shut down internal research after discovering “causal” evidence that Facebook and Instagram were harming users’ mental health, according to newly unredacted filings in a lawsuit brought by U.S. school districts.
The documents were obtained through discovery in a major class action targeting Meta, Google, TikTok, and Snapchat.
The unsealed docs detail how a 2020 internal research program, code-named Project Mercury, produced findings that directly contradicted Meta’s public claims.
Meta scientists partnered with Nielsen to test what happened when users deactivated Facebook and Instagram for one week.
The internal conclusion stunned company executives: users who logged off reported “lower feelings of depression, anxiety, loneliness, and social comparison,” according to Meta’s own analysis.
Rather than publish the results or expand the research, Meta quietly shut the project down.
The company later told employees the negative findings were contaminated by “the existing media narrative” surrounding Facebook, according to the filing.
Privately, Meta insiders knew the findings were real.
One staff researcher reportedly wrote, “The Nielsen study does show causal impact on social comparison,” adding an unhappy-face emoji.
Another staffer warned that burying the results would resemble the tobacco industry “doing research and knowing cigs were bad and then keeping that info to themselves.”
Despite possessing its own internal evidence of harm, Meta told Congress it had “no ability” to measure whether its products were damaging teenagers, including teen girls, the filing alleges.
Meta spokesman Andy Stone disputed the claims Saturday, arguing the study was halted due to flawed methodology.
“The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.
The newly unsealed material forms part of a sweeping lawsuit filed Friday by the law firm Motley Rice on behalf of school districts nationwide.
The suit accuses Meta and other platforms of concealing known risks, encouraging under-13 users to engage with their products, failing to address child exploitation, and attempting to influence child-focused organizations with money.
The allegations against Meta are the most extensive and the most damaging.
According to internal documents cited in the filing, Meta:
• Intentionally designed youth safety tools to be ineffective, and blocked testing of features that might slow user growth.
• Allowed users caught attempting sex-trafficking 17 times before removing them — a threshold one internal document described as “a very, very, very high strike threshold.”
• Recognized that optimizing for teen engagement served more harmful content, and moved forward anyway.
• Delayed efforts to limit child predators’ ability to contact minors, citing concerns about growth metrics.
• And according to a 2021 text message included in the filing, Mark Zuckerberg wrote that he wouldn’t say child safety was his top concern “when I have a number of other areas I’m more focused on like building the metaverse.”
The filing also says Zuckerberg “shot down or ignored” requests from then–global policy head Nick Clegg for increased child-safety investment.
Stone rejected all these claims. He said Meta’s teen-safety systems “are broadly effective,” emphasized that the company now removes accounts immediately when flagged for trafficking, and insisted plaintiffs were relying on “cherry-picked quotes and misinformed opinions.”
None of the underlying internal documents has been made public, and Meta is actively fighting their release.
A hearing on the matter is scheduled for January 26 in the Northern District of California, setting the stage for what could become the largest legal showdown yet over social media’s impact on children and teens.
