“Technologies that generate images of abuse must be completely banned”
International organizations issue a statement…In Korea, the Korea Communications Standards Commission (KCSC) joins
“Build in safety from the design stage…Protective measures must be introduced”
Upon finding sexual exploitation outputs: “Report, condemn, shut down”
[Photo: Grok and images. Getty Images-AFP / Yonhap News]
On the 10th, news of the release of the video artificial intelligence (AI) model Sidance 2.0, promoted as having “completed a two-minute movie for 70,000 won,” drew attention. Interest centered on how precisely image and video AI can produce results that match user intent. That same day, however, voices emerged warning about another use of image AI: international organizations around the world issued a statement that AI must not be misused to create sexual exploitation material.
International organizations that have worked to counter the spread of child sexual exploitation material, including the International Association of Internet Hotlines (INHOPE), Safe Online, and the Internet Watch Foundation (IWF), issued a statement calling for a complete ban on technologies that enable AI to generate images of abuse and nudity (“non-consensual body synthesis,” or nudify tools). In Korea, the Korea Communications Standards Commission (KCSC) joined the statement.
The groups said, “Non-consensual body synthesis (nudify) tools generate nude images of others without consent, and the creation of illegal images of children is also becoming more frequent,” adding, “Companies, developers, and individuals who create or distribute these must be held accountable and face legal and criminal action.”
There are warnings that image AI is no longer a mere tool but has become “a new pathway of exploitation.” The groups pointed out, “AI tools mean offenders do not even need to obtain photos of victims,” and “sexual exploitation images can be fabricated artificially, with chilling efficiency, and at scale.” They also noted, “Perpetrators, including some adolescents, have monetized AI-generated sexual exploitation images, creating a new ‘exploitation economy.’”
In a world where anyone can easily use image AI, what should be done to avoid exposing women and children to risk? The groups argue that “governments and legislators in each country should, within two years at the latest, ban non-consensual body synthesis tools and regulate them so that no one can access them.”
As concrete measures, they said governments should impose criminal liability on those who generate or distribute non-consensual body synthesis images, as well as on companies and individuals who profit from them. They also proposed that governments impose obligations on app stores and similar platforms to block non-consensual body synthesis AI technologies.
The groups emphasize that such AI-generated sexual images must be stopped “right now,” before they spread further. Prevention before exploitation occurs is crucial. They urged technology companies and individuals to “build image AI to be safe from the design stage and adopt effective safeguards,” and to “recognize clearly that non-consensual body synthesis tools inflict irreparable harm and carry the risk of expanding gender-based violence.”
What can be done when image AIs that create sexual exploitation material, or their outputs, are found? The groups say, “Report them, condemn them, and shut down the spaces where they operate.” Last month, when it became known that sexual exploitation material was spreading via Grok, the Korea Communications Standards Commission (KCSC) demanded that X, which operates Grok, submit a user-protection plan. However, there are concerns that such protection measures leave many gaps and cannot keep pace with the speed at which image AI spreads and draws users into exploitative structures.
Minister Won Min-kyung of the Ministry of Gender Equality and Family said, in response to the statement urging a ban on non-consensual body synthesis, “We strongly welcome the international community’s declaration against digital sex crimes,” adding, “We will actively strengthen cooperation with relevant ministries and the international community to respond to the emerging forms of digital sex crimes.”