A legal battle over children’s safety online is unfolding in Italy, where a parents’ advocacy group and several families have launched court action against Meta and TikTok, demanding stricter controls on minors using social media platforms.

The case, heard Thursday before Milan’s business court, is among the latest European efforts to tighten rules on how technology companies interact with young users online.

The lawsuit was filed by MOIGE — a prominent Italian parents’ movement — alongside a number of families. The action targets the companies behind Facebook, Instagram, and TikTok over concerns that millions of underage children are accessing platforms illegally and being exposed to potentially harmful content and engagement systems.

At the center of the case is a demand for stronger age-verification systems capable of preventing children under 14 from creating or using accounts.

The plaintiffs are also asking the court to compel the platforms to remove what they describe as manipulative algorithms designed to maximise user engagement, and to provide clearer warnings about the mental and behavioural risks linked to excessive social media use.

According to MOIGE, nearly 3.5 million Italian children between the ages of 7 and 14 are currently active on social media despite age restrictions already in place.

TikTok and Meta Defend Safety Measures

Responding to the lawsuit, TikTok said the legal process was still ongoing but insisted it already enforces strict safety standards.

“We apply our Community Guidelines rigorously, including those aimed at protecting mental and behavioural health, and proactively remove more than 99% of content that violates them,” a TikTok spokesperson said.

The company added that it continues investing in tools intended to protect younger users, including systems that diversify recommended content, block harmful search results, and connect vulnerable users to support resources.

Meta also rejected the allegations raised in court.

“We know parents worry about the safety of their teens online, which is why we’re consistently making changes to help protect teens,” the company said in a statement.

The social media giant pointed to its Teen Accounts initiative and existing safeguards designed for younger users, adding: “We stand by our record and will continue to do more to keep young people safe.”

Jurisdiction Dispute Emerges in Court

According to MOIGE, lawyers representing Meta and TikTok challenged whether Italian courts have the authority to rule on how global technology companies manage their platforms.

The companies reportedly raised objections over jurisdiction and also disputed fresh evidence submitted by the plaintiffs.

MOIGE’s legal team claims the newly presented documents indicate the companies were aware of the potentially harmful psychological effects their recommendation algorithms could have on minors, particularly features allegedly designed to increase screen time and user dependency.

Lawyers for the parents’ group argued that the matter should be treated as a public health issue and urged judges to fast-track proceedings because of the alleged risks facing children.

The Milan court is expected to announce a timetable for future hearings at a later stage.

Europe Intensifies Pressure on Social Media Firms

The Italian lawsuit comes as governments and regulators across Europe move to tighten rules on children’s online safety.

Earlier this week, Ursula von der Leyen said the European Union’s upcoming Digital Fairness Act would target addictive and harmful design practices used by social media companies.

Countries including Australia, France, Greece, and Spain have also proposed or introduced stricter measures aimed at limiting minors’ access to social media platforms.

Spain, in particular, announced plans earlier this year to ban social media use among teenagers as concerns continue to grow over online addiction, mental health issues, and digital safety for children.