
New deepfake bill would require Big Tech to monitor and remove sexually explicit AI images

WASHINGTON — Lawmakers on Capitol Hill are desperately trying to get a handle on the boom in artificial intelligence-powered deepfake pornographic images targeting everyone from celebrities to high school students.

A new bill would require social media companies to monitor and remove sexually explicit deepfake images posted on their sites without the consent of the people depicted. The measure would criminalize publishing, or threatening to publish, deepfake pornography.

Republican Senator Ted Cruz of Texas is the bill’s lead sponsor. Cruz’s office provided CNBC with exclusive details about the bill.

The Take It Down Act would also require social media platform operators to develop a process to remove the images within 48 hours of receiving a valid request from a victim. In addition, the sites would have to make reasonable efforts to remove all other copies of the images, including those shared in private groups.

Enforcement of the new rules would fall to the Federal Trade Commission, the agency charged with consumer protection.

Cruz’s bill will be formally introduced by a bipartisan group of senators on Tuesday. They will be joined at the Capitol by victims of deepfake porn, including high school students.

The rise in nonconsensual AI-generated images has affected celebrities such as Taylor Swift, politicians such as Rep. Alexandria Ocasio-Cortez (D-N.Y.) and high school students whose classmates have used apps and AI tools to turn photos of their faces into nude or pornographic images.

“By leveling the playing field at the federal level and giving websites the responsibility to establish processes to remove these images, our bill will protect and empower all victims of this heinous crime,” Cruz said in a statement to CNBC.

Dueling Senate bills

According to a report by Home Security Heroes, production of nonconsensual, sexually explicit deepfakes rose 464% in 2023 compared with the previous year.

Although there is broad consensus in Congress that deepfake AI pornography must be addressed, there is no agreement on how to do it.

Instead, the Senate has two competing bills before it.

Democratic Senator Dick Durbin of Illinois introduced a bipartisan bill earlier this year that would allow victims of nonconsensual deepfakes to sue the people who possessed, created or distributed the images.

Under Cruz’s bill, deepfake AI pornography would be treated like other highly offensive online content, meaning social media companies would be responsible for moderating and removing the images.

When Durbin tried to bring his bill to a vote last week, Republican Senator Cynthia Lummis of Wyoming blocked the bill, saying it was “too broad” and could “hinder technological innovation in America.”

Durbin defended his bill by saying, “Under this bill, there is no liability for technology platforms.”

Lummis is one of the original co-sponsors of Cruz’s bill, along with Republican Senator Shelley Moore Capito of West Virginia and Democratic Senators Amy Klobuchar of Minnesota, Richard Blumenthal of Connecticut and Jacky Rosen of Nevada.

The new bill also comes as Senate Majority Leader Chuck Schumer, D-N.Y., is pushing his chamber to move forward with AI legislation. Last month, an artificial intelligence task force released a “roadmap” on key AI issues that included developing laws to combat the “non-consensual distribution of intimate images and other harmful deepfakes.”