
Between abuse, consent, and legislative gaps: the problem of AI generated porn

Artificial intelligence used as a weapon against women online

On April 25, 2015, Tiziana Cantone's nightmare began. She became a victim of revenge porn and was left unsupported by institutions and by the justice system. Her case, far from isolated but widely debated, sparked a public discussion. Four years later, Italy introduced a law against revenge porn, Article 612-ter of the Penal Code, which penalizes anyone who shares sexually explicit videos or photos meant to remain private without the consent of the person depicted, as well as anyone who receives such content and allows it to spread. It is not enough.

Online Abuse and the Consent Issue

However, the law alone hasn't stopped some individuals from creating groups on platforms like Telegram to exchange, without the knowledge of those involved, photos of friends, family members, or acquaintances, including minors, taken from social media or received privately. The exchanges come with vivid descriptions of what the members would do to these women: the more violent, the better. Nor does the law prevent the daily filming and posting of girls on social media for various purposes. A recent example is a video shot in Manchester showing a group of girls in evening outfits, which went viral and was used by right-wing extremists as evidence of contemporary women's supposed lax morals, putting the girls depicted at risk of doxxing.

The Rise of AI-Generated Porn

In this already concerning context, some people now use Artificial Intelligence to make the process even easier, and harder to control. Dozens of these tools, both free and paid, exist. They can generate pornographic or nude images and videos from real photos, altering them with added elements or with deepfake technology. This is a severe violation of consent and a form of online abuse. When created with high-quality tools, such content becomes virtually indistinguishable from real footage, leaving victims feeling violated, costing them jobs, and damaging their reputations. Traditional pornography is apparently no longer enough: now men want to see their colleagues, friends, and neighbors nude and engaged in explicit acts, a personalized experience for their viewing pleasure.

Cases Are on the Rise

According to Time, these tools have never been more popular, and the first testimonies are emerging. Victims such as Francesca Mani, Cecilia Luque, Helen Mort, and 11 underage girls in Almendralejo, Spain, have been targeted by deepfakes. Cases are increasing, laws are not catching up quickly enough, and not all of them cover artificially manufactured images, leaving a legislative gap in which these men operate.

What to Do? Associations and Resources

To address this issue, websites and resource collections are emerging to help victims in the US and in Italy. Governments, social platforms, and search engines are being asked to help remove links and ads related to these harmful tools and to limit the circulation of non-consensual pornography. In Italy, the association "Permesso Negato" guides victims through reporting the abuse and recovering from it. Ultimately, though, this is a cultural problem, one that needs to be addressed through deeper social and emotional education rather than merely by limiting public access to Artificial Intelligence. We need to teach an understanding of boundaries and power imbalances in order to educate men.