The probe, announced Thursday, is being conducted under the European Union’s Digital Services Act (DSA), a regulatory framework that requires large online platforms to implement robust safeguards against illegal and harmful content. Noncompliance could result in fines of up to 6% of a company’s global annual revenue.
Henna Virkkunen, the EU’s Commissioner for Technology, said the investigation follows evidence that Snapchat has “overlooked that the Digital Services Act demands high safety standards for all users.” She cited issues ranging from child grooming and exposure to illegal products to account settings that may compromise the safety of minors.
The European Commission, which oversees enforcement of the DSA, indicated that Snapchat’s content moderation tools may be insufficient to prevent users from promoting or facilitating the sale of illegal products, including drugs, as well as age-restricted items such as vapes and alcohol. The Commission also raised concerns about Snapchat’s age verification process, default account settings, and reporting mechanisms for harmful content.
This investigation builds on a case originally opened by Dutch regulators in September, which focused on the sale of vapes to children through the platform. The European Commission will now take over and expand the inquiry to broader aspects of child protection and illegal content enforcement.
Snapchat responded by emphasizing its commitment to user safety. A spokesperson said, “We have fully cooperated with the Commission to date—engaging proactively, transparently and working in good faith to meet the DSA’s high safety standards—and we will continue to do so throughout this investigation.”
The investigation highlights growing regulatory pressure on social media companies across Europe, where authorities are increasingly scrutinizing how technology platforms protect minors and curb illegal commerce.
