South Korea announced plans on Wednesday to request greater cooperation from Telegram and other social media platforms in curbing the spread of sexually explicit deepfake content. This initiative is part of broader efforts to address the escalating issue of digital sex crimes.
The decision follows widespread public outrage after reports surfaced that explicit deepfake images and videos of South Korean women were frequently being shared in Telegram chatrooms. In response, the Korea Communications Standards Commission (KCSC) said it will establish a 24-hour hotline for victims and double its regulatory personnel monitoring digital sex crimes from 70 to 140 members.
The Korean National Police Agency also plans to launch a seven-month campaign focused on cracking down on online sex crimes. As part of these efforts, the KCSC intends to create a consultative body to improve communication with social media companies regarding the removal and blocking of deepfake content. KCSC Chairman Ryu Hee-lim emphasized the gravity of the issue, stating, “The creation, possession, and distribution of deepfake sex crime videos are serious offenses that violate individual dignity and personal rights.”
In addition to engaging with Telegram, South Korea will seek collaboration from platforms like X (formerly Twitter), Meta’s Facebook and Instagram, and Google’s YouTube. For companies without a physical presence in South Korea, the KCSC plans to establish direct communication channels for ongoing consultation.
The move comes as criticism of Telegram in South Korea intensifies, coinciding with the recent arrest of Telegram’s founder, Pavel Durov, in France as part of an investigation into child pornography, drug trafficking, and fraud on the encrypted messaging platform.
The number of deepfake sex crime cases in South Korea has risen sharply, with police data showing an increase from 156 cases in 2021 to 297 so far this year. Most offenders are reported to be teenagers, while the victims, predominantly women and girls, include school students and female soldiers.
This year alone, more than 6,400 requests for help removing sexually explicit deepfake content have been filed with the KCSC, approaching the nearly 7,200 cases in which the commission intervened over the whole of last year.