White House urges tech industry to shut down market for sexually abusive AI deepfakes

President Joe Biden’s administration is pushing the technology industry and financial institutions to shut down a growing market for sexually abusive images created using artificial intelligence.

New generative AI tools have made it easy to transform a person’s likeness into a sexually explicit AI deepfake and share these realistic images in chat rooms or social media, leaving victims – be they celebrities or children – with little recourse to stop it.

The White House on Thursday is calling on companies to cooperate voluntarily in the absence of federal law. By committing to a series of concrete actions, officials hope the private sector can curb the creation, distribution and monetization of such non-consensual AI images, including explicit images of children.

“When generative AI came on the scene, everyone was speculating about where the first real dangers would emerge. And I think we have the answer,” said Biden’s chief science adviser Arati Prabhakar, director of the White House Office of Science and Technology Policy.

She told The Associated Press there has been a “phenomenal increase” in the distribution of non-consensual intimate images, fueled by AI tools and targeting women and girls in particular in ways that can turn their lives upside down.

“Whether you’re a teenager, whether you’re a gay kid, these are the issues that people are facing right now,” she said. “We’ve seen an acceleration because generative AI is evolving very quickly. And the fastest thing that can happen is for companies to step up and take responsibility.”


A document seen by AP ahead of its release Thursday calls for action not only from AI developers, but also from payment processors, financial institutions, cloud computing providers, search engines and the gatekeepers — namely Apple and Google — that control what makes it into mobile app stores.

The private sector must step up to stop the “monetization” of image-based sexual abuse, in particular by restricting payment access for websites that promote explicit images of minors, the government said.

Prabhakar said many payment platforms and financial institutions have already stated that they will not support companies that distribute offensive images.

“But sometimes it’s not enforced; sometimes those terms of service don’t exist,” she said. “And that’s an example of something that could be enforced much more strictly.”

Providers of cloud services and app stores for mobile devices could also “restrict web services and mobile applications that are marketed with the aim of creating or modifying sexual images without the consent of the persons concerned,” the document says.

And whether an image was generated by artificial intelligence or a real nude photo was posted online, survivors should be able to get online platforms to remove it more easily.

The most high-profile victim of pornographic deepfake images is Taylor Swift, whose ardent fanbase hit back in January when offensive AI-generated images of the singer-songwriter began circulating on social media. Microsoft vowed to strengthen its security measures after some of the Swift images were traced back to its AI visual design tool.

More and more schools in the US and elsewhere are also grappling with AI-generated deepfake nude images of their students. In some cases, fellow teenagers were found to have created the AI-manipulated images and shared them with classmates.

Last summer, the Biden administration joined Amazon, Google, Meta, Microsoft and other major technology companies in a voluntary commitment to impose a series of safeguards on new AI systems before they are released to the public.

Biden then signed an ambitious executive order in October to guide the development of AI so that companies can benefit from it without compromising public safety. While the focus was on broader AI issues, including national security, it also addressed the emerging problem of AI-generated child abuse images and the search for better ways to detect them.

But Biden also said the government’s AI safeguards would need to be backed by legislation. A bipartisan group of U.S. senators is now urging Congress to spend at least $32 billion over the next three years to develop artificial intelligence and fund measures to safely control it, but has largely put off calls to enshrine those safeguards in law.

Encouraging companies to get involved and make voluntary commitments “does not change the fundamental need for Congress to take action here,” said Jennifer Klein, director of the White House Gender Policy Council.

Longstanding laws already criminalize the production and possession of sexual images of children, even if they are fake. Federal prosecutors filed charges earlier this month against a Wisconsin man who allegedly used a popular AI image generator called Stable Diffusion to create thousands of realistic AI-generated images of minors engaged in sexual acts. A lawyer for the man declined to comment after his arraignment on Wednesday.

But there is little control over the technical tools and services that enable the creation of such images. Some of them are located on dubious commercial websites that reveal little information about who runs them or what technology they are based on.

The Stanford Internet Observatory announced in December that it had found thousands of images of suspected child sexual abuse in the massive AI database LAION, an index of online images and captions used to train leading AI image generators such as Stable Diffusion.

London-based Stability AI, which owns the latest versions of Stable Diffusion, said this week that it had “not approved” the release of the earlier model the Wisconsin man allegedly used. Such open-source models are hard to put back in the bottle because their technical components are publicly available on the internet.

Prabhakar said it is not just open source AI technology that is causing harm.

“It’s a broader problem,” she said. “Unfortunately, a lot of people in that category seem to be using image generators. And we’ve just seen such an explosion in that space. But I don’t think it’s neatly divided into open source and proprietary systems.”