South Korean officials said on Wednesday that they will ask Telegram and other social media platforms to more actively help remove and block sexually explicit deepfake content, part of a broader effort to tackle the escalating problem.

This initiative follows widespread public and political backlash after various domestic media outlets revealed that sexually explicit deepfake images and videos of South Korean women frequently appeared in Telegram chatrooms.

Additionally, a 24-hour hotline for victims will be established, and the number of regulatory personnel overseeing digital sex crimes will be increased from the current 70.

The Korea Communications Standards Commission indicated that the Korean National Police Agency will also embark on a seven-month campaign to combat online sex crimes.

The media regulator also plans to create a consultative group to improve collaboration with social media companies on the removal and blocking of sexually explicit deepfake content, its chairman, Ryu Hee-lim, said at a meeting on the issue.

For companies that do not have offices in South Korea, the commission intends to establish a direct communication channel for ongoing consultations. "The creation, possession, and distribution of deepfake sexual crime videos constitute a grave offense that undermines individual dignity and personal rights," Ryu said.

The commission indicated that, in addition to Telegram, it would seek collaboration with X, as well as Meta’s Facebook and Instagram, and Google’s YouTube.

None of the companies, however, has responded to Reuters' requests for comment.

The scrutiny of Telegram in South Korea has coincided with the recent arrest of its founder, Pavel Durov, as part of a French investigation into child pornography, drug trafficking, and fraud linked to the encrypted messaging platform.

According to police data, the number of deepfake sex crime cases in South Korea has risen from 156 in 2021, when records began, to 297 this year, and most of the offenders have been teenagers.

The victims are predominantly female, including school students and women serving in the South Korean military.

This year, South Koreans have submitted more than 6,400 requests to the Korea Communications Standards Commission to remove sexually explicit deepfake content, compared with nearly 7,200 cases last year in which the commission agreed to help remove such content.