Non-consensual AI porn doesn’t violate privacy – but it’s still wrong | The Conversation

All entries on Feminist Legal Clinic’s News Digest Blog are extracts from news articles and other publications, with the source available at the link at the bottom. The content is not originally generated by Feminist Legal Clinic and does not necessarily reflect our views.

In 2024, Australia amended its Criminal Code to explicitly include AI-generated porn in the offence of distributing sexual material depicting others without their consent. As a result, digitally manipulated sexual imagery now falls within the same legal category as genuine photographs or video footage.

There are gaps in this legislation. Most notably, the relevant offence prohibits transmitting such material via a carriage service (such as the internet), but there is no standalone offence for creating it. Only sharing is explicitly prohibited.

As the law doesn’t clearly prohibit private creation and use of deepfake pornography, individuals must make their own moral choices.

Most commonly, deepfake pornography has been described as a privacy violation. It’s easy to see the appeal of this view.

However, there is a problem with the privacy argument.

[P]rivacy concerns information that is particular to us – such as identifying details about our bodies, or how we express ourselves sexually.

Assumptions we make based on generic facts about humans are different. You can violate someone’s privacy by sharing specific details from their sexual history. You can’t violate their privacy by announcing they probably have nipples, and probably sometimes have sex.

While powerful in some respects, AI tools can’t reveal the genuinely private aspects of our sexual lives.

Source: Non-consensual AI porn doesn’t violate privacy – but it’s still wrong
