The US Government has announced a commitment from the AI industry to reduce image-based sexual abuse.
The “voluntary commitments,” which cover both AI model developers and data providers, commit technology firms to act against non-consensual intimate images and child sexual abuse material.
According to the White House, the code of practice builds on commitments the industry made last year to reduce the risks of AI by ensuring “safety, security and trust.”
Under the new commitments, AI firms will act to curb sexual abuse perpetrated through AI-generated images.
Under the arrangement, Adobe, Anthropic, Cohere, Common Crawl, Microsoft and OpenAI will commit to “responsibly sourcing” datasets and to safeguard them from image-based sexual abuse.
The firms will improve their development processes and add feedback loops to prevent AI models from creating sexual abuse images. They also committed to removing nude images from AI training datasets “where appropriate.”
“Image-based sexual abuse – both non-consensual intimate images (NCII) of adults and child sexual abuse material (CSAM), including AI-generated images – has skyrocketed,” the White House said, announcing the commitments. “This abuse has profound consequences for individual safety and well-being.”
The commitments are among several initiatives from the technology and AI industry to reduce the threats from AI abuse images.
These include moves by Cash App and Square to curb payments to companies promoting image-based sexual abuse, and expanded participation in initiatives that help detect sextortion.
Google is updating its platform, including its search engine, to combat non-consensual intimate images (NCII), and Microsoft has worked to tackle NCII on Bing and to signpost resources for victims.
Meta, Snap and GitHub have also acted against NCII and the tools that can share it, including AI-generated content. Meta, for example, removed 63,000 accounts involved in sextortion scams in July alone.
A working group, including technology firms, civil society groups and researchers, will also investigate how to “identify interventions to prevent and mitigate the harms caused by the creation, spread, and monetization of image-based sexual abuse.” The group will adopt a set of voluntary principles to combat image-based abuse, the White House said.