
Australia introduces bill to criminalise sexually explicit deepfakes

Australian Attorney-General Mark Dreyfus introduced the Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 to Parliament on Wednesday. Dreyfus and Australian Prime Minister Anthony Albanese both described the non-consensual sharing of sexually explicit deepfake material as a “harmful and deeply disturbing form of abuse” that should be punishable by severe criminal penalties. The bill received its first reading in Parliament and now awaits a second reading.

According to the bill, sharing sexually explicit deepfake content without consent could be punishable by up to six years in prison. The bill also creates two more serious offenses, targeting repeat offenders and those who created the content they share, each punishable by up to seven years in prison. The new offenses apply only to sexual material depicting adults; child abuse material will continue to be prosecuted under separate, specific charges.

Dreyfus acknowledged in an interview on local radio that the anonymity of social media accounts may make it difficult to identify and prosecute those who share deepfakes online. He maintained, however, that technological means of tracing could help overcome those difficulties, and that they were no reason not to ban the non-consensual creation and sharing of sexually explicit deepfake material or to attach severe criminal consequences to it.

In addition to the new bill, the government is promising to expand on previous initiatives, including increased funding for the eSafety Commissioner, measures to tackle harmful behaviour such as doxxing, and an overhaul of the Privacy Act to give all Australians, particularly women who are victims of domestic violence, greater control over their personal information.

Discussions about introducing the law began in May, when federal politicians met to discuss Australia’s gender-based violence crisis. Albanese said the law was designed to “protect women,” citing the growing volume of deepfake pornographic content depicting women in Australia. A 2023 report by social media analytics firm Graphika found a 2,000 percent increase in the number of websites using artificial intelligence to create non-consensual sexual images. The federal government has also committed AU$1 billion in funding to its broader response to gender-based violence.