New Mexico Attorney General Raúl Torrez, a Democrat, alleges that Meta promoted illegal content and enabled the sexual exploitation of minors across Facebook, Instagram, and WhatsApp. The lawsuit claims the company allowed predators easy access to underage users and connected them with victims, sometimes leading to real-world abuse and human trafficking.
Jury selection is scheduled to begin Monday in Santa Fe District Court, with the trial expected to last seven to eight weeks. Meta denies the allegations, asserting it has extensive safeguards to protect younger users.
Undercover Operation Sparks Lawsuit
The case stems from a 2023 undercover operation called “Operation MetaPhile,” led by Torrez and his office. Investigators created Facebook and Instagram accounts posing as users under 14. According to the attorney general’s office, the accounts received sexually explicit material and were contacted by adults seeking similar content, resulting in criminal charges against three individuals.
New Mexico also argues that Meta designed its platforms to maximize engagement, despite evidence of harm to children’s mental health. The lawsuit highlights features like infinite scroll and auto-play videos, claiming they foster addictive behavior linked to depression, anxiety, and self-harm.
The complaint further alleges that internal Meta documents acknowledged problems with sexual exploitation and mental health harms, yet the company failed to implement basic safety tools, such as age verification, and misrepresented the safety of its platforms.
New Mexico is seeking monetary damages and court-ordered changes to improve safety for children using Meta’s platforms.
Meta Calls Claims “Sensationalist”
A Meta spokesperson dismissed the lawsuit as “sensationalist, irrelevant and distracting,” saying the case relies on “cherry-picked documents.” The company said it has worked with parents, experts, and law enforcement for years and continues to improve its safety systems.
Meta has argued that it is protected from liability by the First Amendment and Section 230 of the Communications Decency Act, which generally shields online platforms from lawsuits over user-generated content. The company says the state’s allegations are inseparable from user content because its algorithms and design features serve to publish that content.
Broader Scrutiny and Ongoing Litigation
Meta has faced growing scrutiny over the safety of child and teen users, especially after whistleblower testimony before Congress in 2021 alleging the company knew its products could be harmful. In 2023, Reuters reported an internal policy allowed Meta’s chatbots to engage minors in romantic or sensual conversations. Meta confirmed the document but said it removed such language after the report.
Evidence related to the company’s AI chatbots is expected to be presented during the trial.
Meta is also battling thousands of other lawsuits accusing it and other social media companies of designing products to be addictive to young people, contributing to a national mental health crisis. Some of these suits seek damages in the tens of billions of dollars, according to Meta’s financial filings.
The first trial in that broader litigation began in Los Angeles earlier this week, with Alphabet’s Google and Meta remaining as defendants after TikTok and Snap settled with the plaintiff.
