
Congress launches bipartisan initiative to combat explicit AI deepfakes

A group of bipartisan lawmakers has introduced a bill to combat the multitude of websites that distribute sexually explicit deepfakes without consent.

In recent months, Republicans and Democrats have introduced several bills designed to hold parties accountable for the distribution of deepfake pornography and give victims the opportunity to seek financial compensation.

The term “deepfake” refers to images or videos that depict people in fake situations, often using artificial intelligence. A 2019 study by Deeptrace Labs found that 96% of all deepfake videos were non-consensual pornography. These sites allow anyone to use generative AI to create realistic and explicit depictions of another person without their consent, even children. What once required hundreds of images and computer editing skills now requires one or two photos and a mobile phone.

There are currently no federal laws preventing these websites from operating.

“There are now hundreds of apps that can create non-consensual, sexually explicit deepfakes right on your phone,” Senate Judiciary Committee Chairman Dick Durbin (D-IL) told Politico in May. “Congress must address this growing crisis as quickly as possible.”

For over a year, advocacy groups have been urging Congress to take action against these malicious websites as numerous deepfakes of prominent public figures such as Taylor Swift and Representative Alexandria Ocasio-Cortez (D-NY) have surfaced.

Ocasio-Cortez, who has herself been targeted, worked with Durbin to push the DEFIANCE Act, which is intended to “stop the distribution of non-consensual, sexually explicit deepfakes” by creating a “federal civil remedy” for victims. Under the bill, victims could bring suit “against individuals who produced or possessed (deepfakes) with the intent to distribute them against the will of the subjects depicted.”

Republicans are taking a different but complementary approach, focusing on criminalizing these fakes. Rep. Nancy Mace (R-SC) has introduced two bills that would increase fines for distributing non-consensual pornography from $150,000 to $500,000.

Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN) have proposed the bipartisan TAKE IT DOWN Act, which would criminalize both the “publication” of and the “threat to publish” non-consensual AI deepfakes. The bill would also require websites and social media platforms to remove such content from their feeds to minimize distribution.

But some lawmakers fear that tech companies will invoke Section 230 of the Communications Decency Act to find a loophole. The 1996 law protects tech giants from any liability related to their users’ content. Carrie Goldberg, a lawyer who represents many of Harvey Weinstein’s accusers, is calling for the law to be repealed entirely, saying it puts the concerns of Big Tech companies ahead of those of internet victims.

“The best way to deal with so much of the harm that happens on platforms is for the platforms themselves to bear the costs and liability,” Goldberg told The Hill.

As the election approaches, the legislative machinery has nearly ground to a halt as many lawmakers focus on keeping their seats. When Durbin brought the DEFIANCE Act to the Senate floor, it was quickly blocked by Senator Cynthia Lummis (R-WY), even though she supported the TAKE IT DOWN Act. Lummis worried that the bill’s language was too broad, potentially damaging online privacy and innovation while ultimately failing victims.

CLICK HERE TO READ MORE FROM THE WASHINGTON EXAMINER

Despite these inevitable conflicts, support on both sides of the aisle remains overwhelming.

“Victims of involuntary pornographic deepfakes have waited too long for federal legislation that holds perpetrators accountable,” Ocasio-Cortez said. “Congress must act to show victims they will not be abandoned.”