21 August 2024

Parliament poised to criminalise sharing sexual 'deepfakes' as survey reveals AI concerns

| Oliver Jacques
Associate Professor Tanya Notley, Western Sydney University; Senator David Pocock; Professor Sora Park, University of Canberra; and Dr Simon Chambers, Western Sydney University, at the launch of a report in Old Parliament House. The survey found the majority of adults wanted regulation to mitigate potential AI harms. Photo: UWS.

A new law criminalising the non-consensual online publication of sexual ‘deepfakes’, material created with artificial intelligence (AI) to make it appear someone is doing something they never did, could be passed by the Senate later this week.

The federal government’s Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 was passed by the House of Representatives in July and referred to a Senate committee, which recommended it be made law urgently. The bill provides for up to six years’ imprisonment for people who share digitally altered pornographic content without the consent of the person depicted, such as an image in which someone’s face has been superimposed onto a person having sex.

“[The bill] is currently caught in a legislative logjam in the Senate but may pass as part of a guillotine this week,” Independent ACT Senator David Pocock said.

A guillotine is a procedure in which a majority of senators agree to end debate and bring a bill to a quick vote. The legislation was debated on Monday (19 August) and is expected to become law later this week with the support of the Greens and independents.

Senator Pocock said reform was important given the results of a new survey revealing many Australians had concerns about AI.

He held a press conference at Parliament House to launch Adult Media Literacy in 2024: Australian Attitudes, Experiences and Needs, a report jointly produced by the University of Canberra, Western Sydney University and Queensland University of Technology.

“The findings from this survey reflect what I have been hearing from the community and align with some of the big debates we are having up in parliament,” Senator Pocock said.

“The survey found that four in 10 adult Australians have experimented with generative AI services … but crucially, what it also identified was strong concern about this technology and that the majority of adults wanted regulation to mitigate potential harms.”

The Senator said it’s not just sexual images that are a problem, and he wants to see reforms that go beyond the sexual deepfakes bill.

“Almost half [40 per cent] of the world’s population will head to the polls this year and we are already witnessing bold attempts to influence election outcomes through AI,” he said.

“From robocalls in [US] President Biden’s voice telling people not to vote in the New Hampshire primaries to Queensland Premier Steven Miles sporting some truly awful dance moves … the risk lies in the realms of deepfake democracy – where a general population with low digital literacy, lacking confidence in their own ability to spot AI content, can be fooled … where misinformation takes hold and spreads like wildfire.

“Together with tougher laws and better-resourced regulators, we can manage the risks while maximising the opportunity from our digital world and AI in particular.”

Associate Professor Tanya Notley, who led the media literacy report and survey, called for more education.

“Most adult Australians are not confident about their ability to identify false and misleading information online, create a video and post it online, edit a digital photo, change social media privacy settings, or seek help from relevant authorities if they are being harassed online,” she said.

“We found that there is overwhelming demand among Australians for adult and school-based media literacy education. However, too many Australians have not received any form of media literacy education, or they don’t have access to support when they need it.”

Original Article published by Oliver Jacques on Riotact.

