
Bill on deepfake sexual material introduced

On June 5, 2024, Attorney General Mark Dreyfus introduced the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (Bill).

The bill amends the Criminal Code Act 1995 (Criminal Code) and focuses on the creation and “non-consensual sharing of sexually explicit material.” This includes content created or altered through technologies such as deepfakes.

The statement says: “As technology advances AI and machine learning, deepfake techniques are becoming more sophisticated, making it almost impossible to detect deepfake material.

“The use of technology … to create fake sexual material poses significant risks to the Australian community and the non-consensual sharing of this material can have long-term harmful effects on victims.”

In his second reading speech, Dreyfus stressed that “the sharing of (d)igitally created and modified sexually explicit material without consent is a harmful and deeply disturbing form of abuse.”

Dreyfus also stressed: “This insidious behaviour is degrading, humiliating and dehumanising for the victims. Such acts are predominantly directed against women and girls and perpetuate harmful gender stereotypes and gender-based violence.”

The bill introduces new offences and penalties. It repeals section 474.17A of the Criminal Code and replaces it with a new section dealing with the use of a carriage service to transmit sexual material without consent.

Dr Carolyn McKay, senior research fellow and co-director of the Sydney Institute of Criminology at the University of Sydney’s Faculty of Law, says: “The bill repeals some existing offences in the Commonwealth Criminal Code and introduces a consent-non-consent model.”

McKay says: “One thing is interesting: (the bill) does not define consent for the purposes of these new provisions… it relies on the kind of ordinary meaning of the term, which seems to be based on the idea of free and voluntary consent to the sharing (of the sexual material).”

Under proposed new section 474.17A, a person commits an offence if they use a carriage service to transmit material to another person that depicts, or appears to depict, that other person in a sexual pose or activity, and they either know the other person does not consent to the transmission of the material or are reckless as to whether the other person consents.

An important aspect of the bill is that it specifically refers to adults, not children. If children are involved in the conduct, “the existing offences of sexual exploitation of children still apply,” says McKay.

Despite its name, the bill does not explicitly refer to artificial intelligence or deepfakes. “…(I)t is probably usefully worded quite broadly and just talks about technology…that would include the current technology that we have today, which would be (including)…Photoshop or Photoshop-like technologies, as well as the increasing use of apps, AI apps and deepfake apps,” McKay says.

Several studies have demonstrated the harm and suffering that can be caused by the non-consensual sharing of sexual images. “… AI … is reaching a point where many people can no longer tell the difference between an AI image, a deepfake and reality,” says McKay.

“…(T)his newer legislation is perhaps more aligned with and more clearly focused on these new technologies,” she says.

In Victoria, police are investigating the creation and distribution of sexually explicit fake images of students at a Melbourne school.

Dr Asher Flynn, Associate Professor of Criminology at Monash University and Chief Investigator at the ARC Centre of Excellence for the Elimination of Violence Against Women (CEVAW), believes the problem requires a multi-faceted response.

“It reflects the cultural and societal attitude towards women and especially young girls. They are turned into objects.”

“By doing this, you are sending me the message that your body, your image, is there for me and I can use it as I want,” she says.

According to Flynn, the real problem is that the sharing of sexualised content without consent, particularly images of women and young girls, appears to have become “normalised”.

Another significant problem is how accessible the apps and tools used to create deepfake images have become. “That’s one of the really scary elements,” says Flynn. A few years ago, creating these images required particular skills, access to particular technology, or paying someone to do it. Now, anyone can download an app that allows them to create such images.

“I think the increased accessibility of these things is a scary phenomenon. Hopefully this law will help prevent that,” Flynn says.

“It sends a really clear message to the community that we take sexual abuse through deepfakes seriously and that there will be consequences for people who use these technologies in really harmful ways.”

Flynn says Victoria is currently the only jurisdiction where the creation of deepfake images is a criminal offence.

“The focus of this (Commonwealth) law is on the distribution of the images,” she says. “What I found disappointing was that the creation of a sexualized deepfake was not classified as a separate offence.”

So what causes people to create deepfake images of another person?

In 2022, Flynn and colleagues from universities in Australia and the UK published a study on deepfakes and the prevalence of digitally altered images as a form of sexual abuse (The British Journal of Criminology (2022) 62, 1341–1358). They found that more than 14 per cent of respondents had experienced the “misuse of digitally altered images”, including deepfakes.

“Our research has shown that this type of abuse varies greatly. Often it is about directly hurting or humiliating the victim or getting revenge for something. But sometimes these images are also presented in a humorous way. For example, in the case of men who have done this to a friend who is about to get married,” she says.

Whatever the motivation behind this behaviour, the bill sets out penalties for those who are prosecuted and found guilty.

They range from six years’ imprisonment for using a carriage service to transmit sexual material without consent to seven years for aggravated offences involving the transmission of sexual material without consent. However, McKay points out that these are only “maximum” sentences, not mandatory ones. “So there is still some judicial discretion in sentencing,” she says.

According to Flynn, this is “considered a fairly high penalty in the context of other crimes we have investigated.”

“It will be interesting to see, but I think … it sends the message that we treat this as a serious form of sexual violence.”