Newly unredacted court filings in a nationwide lawsuit brought by U.S. school districts offer a detailed look at how Meta allegedly handled internal research into the potential mental-health effects of its platforms. The documents, part of a broader case targeting Meta, Google, TikTok, and Snapchat, depict a company wrestling with troubling findings—and, according to plaintiffs, choosing not to pursue them.

The filings describe a 2020 Meta research effort known as Project Mercury, in which company researchers, working with the outside research firm Nielsen, studied the effects of temporarily deactivating Facebook. According to internal summaries cited in the complaint, participants who logged off for a week reported decreased levels of depression, anxiety, loneliness, and social comparison. Those results, the documents say, surprised and disappointed Meta’s research team.

Rather than publish or expand on this work, the filing claims, Meta halted the project and reasoned internally that the results were shaped by “existing media narratives.” Yet at least one staff researcher allegedly pushed back, stating privately that the findings did show a “causal impact on social comparison,” while another employee warned that burying negative outcomes felt comparable to tobacco companies concealing evidence of harm.

Despite these internal assessments, the plaintiffs argue Meta continued to assure Congress that it lacked the ability to measure whether its products negatively affected teenagers—particularly teenage girls.

Meta spokesperson Andy Stone responded that the study was abandoned because its methodology was flawed, adding that the company has invested heavily for years in updating safety tools and consulting with parents and experts. “The full record will show that for over a decade, we have listened to parents, researched issues that matter most, and made real changes to protect teens,” he said.

Broader Allegations in the School District Lawsuit

The claim that Meta suppressed unfavorable research is one of several allegations laid out by the law firm Motley Rice, which filed an extensive brief late Friday on behalf of school districts around the country. The plaintiffs contend that the major platforms have long recognized risks to children and teens but publicly minimized or obscured them.

While TikTok, Google, and Snapchat did not comment on the new filing, the allegations directed at Meta are notably more detailed. Among the claims attributed to Meta’s own internal documents:

  • Youth safety tools were knowingly designed with limited effectiveness, and testing of potentially protective features was blocked over concerns about slowing growth.
  • Users attempting to traffic others for sex allegedly needed to violate rules repeatedly—up to 17 times—before removal, a threshold one document described as “very, very, very high.”
  • Efforts to increase teen engagement were known to result in more harmful content being surfaced to young users.
  • Proposals to restrict predators’ ability to contact minors were delayed for years amid worries about user-growth metrics.
  • Internal messages from 2021 reportedly show Mark Zuckerberg prioritizing the metaverse over child-safety work, despite requests from senior policy staff for increased investment in that area.

Stone rejected the characterizations, saying the plaintiffs rely on selective excerpts and misinterpreted opinions. He asserted that Meta’s safeguards for teens are “broadly effective,” and that the company now removes accounts immediately when they are flagged for sex-trafficking activity.

Beyond Meta, the filing also accuses other platforms of courting child-focused organizations to bolster public messaging about safety. In one cited example, TikTok allegedly viewed its sponsorship of the National PTA as a way to influence the group’s public stance.

What Comes Next

The internal Meta documents at the center of the filings are not currently public. Meta has asked the court to strike or limit what the plaintiffs seek to unseal, arguing not that the material must remain confidential in principle but that the plaintiffs’ request is overly broad.

A hearing on these issues is scheduled for January 26 in the Northern District of California. The outcome could determine how much of the companies’ internal research and safety deliberations ultimately become part of the public record—and how the broader debate over social-media harm evolves in the months ahead.