The appeal follows the emergence of a TikTok profile that featured videos of young women dressed in Polish national colours and promoting Poland’s withdrawal from the EU. The account gained traction in recent weeks before disappearing from the platform. Polish authorities say the material bears the hallmarks of coordinated disinformation and is almost certainly linked to Russia.
In a letter to the Commission, Deputy Digitalization Minister Dariusz Standerski warned that the content posed a serious risk to public order, information security, and democratic processes not only in Poland but across the EU. He argued that the narratives, their method of distribution, and the use of synthetic audio-visual techniques point to failures by TikTok to meet its obligations as a “Very Large Online Platform” under EU law.
A government spokesperson went further, saying the videos contained linguistic patterns consistent with Russian syntax, strengthening suspicions that the campaign originated outside Poland. Russia has repeatedly denied interfering in foreign elections or political debates.
TikTok said it had engaged with Polish authorities over the videos. “We have been in contact with Polish authorities and have removed content where it violates our policies,” a company spokesperson said. The European Commission and the Russian embassy in Warsaw did not immediately respond to requests for comment.
The appeal comes amid heightened vigilance across the EU over foreign attempts to influence elections and domestic politics. Several member states have warned of increased Russian-sponsored disinformation, espionage, and sabotage activity, particularly online.
Last year, the European Commission opened formal proceedings against TikTok, owned by China’s ByteDance, over concerns that it failed to adequately prevent election interference, including during Romania’s presidential vote in November 2024.
Poland is now calling on the Commission to launch further proceedings under the Digital Services Act (DSA), the bloc’s sweeping regulation governing how major online platforms operate in Europe. Under the DSA, companies such as TikTok, X, and Facebook are required to swiftly moderate and remove harmful content, including hate speech, xenophobia, and disinformation. Failure to comply can result in fines of up to 6% of a company’s global annual turnover.
The case adds to growing pressure on social media firms to demonstrate that their safeguards can keep pace with increasingly sophisticated AI-generated content and coordinated influence campaigns.
