The Europe-wide battle to crack down on revenge porn

Irish artist Emmalene Blake works on a mural, located in Dublin’s city centre, on 26 November 2020. The artist’s latest work relates to Irish legislation outlawing so-called ‘revenge porn’, a form of image-based sexual abuse.

(NurPhoto via AFP/Artur Widak)

In March 2019, a school friend contacted Anna N. (not her real name), who was 27 at the time and based in Berlin. Her friend said there were pictures of her online that weren’t supposed to be there. Anna found a dozen photos of herself, some nudes, some bikini shots, some headshots, published on various porn sites without her knowledge or consent. “There was also a picture of my Facebook profile which resulted in me getting a lot of private messages. I got many ‘compliments’ on my breasts. Also, many dick pics,” she tells Equal Times in a phone interview. Later, Anna realised that her Dropbox account had been hacked and concluded that the photos were taken from there, although she never learned who the culprit was.

Anna’s story is hardly an isolated incident. Online image-based sexual abuse (also called ‘revenge porn’) is an umbrella term that encompasses different abusive behaviours such as the non-consensual sharing and/or taking of nude or sexual images, the threat to do so, or the construction of fake pornographic photos and videos using digital manipulation techniques (so-called ‘deepfakes’). The phenomenon has become mainstream in recent years, with several high-profile cases of celebrities having their explicit photos and videos shared.

It is hard to pin down the numbers, because the data is incomplete and inconsistent across Europe, but the UK-based charity Revenge Porn Helpline says that the number of reports doubled in 2020, reaching a record 3,146 cases, and continues to rise. Meanwhile, between 2015 and 2020, the helpline removed nearly 200,000 pieces of illegally shared content. According to the Irish organisation Victims Alliance, in 2020 over 100,000 images of Irish women and girls were leaked online, and a 2021 survey of 2,000 people aged between 18 and 80 from across the European Union found that 30 per cent of women polled feared that fake intimate images of them might be shared without their consent.

On the internet, sexually explicit or intimate images are shared quickly, easily, and often without repercussions, even though many countries have passed laws that criminalise this act. Many of the victims face an uphill battle when trying to seek justice or remove the explicit content from the public domain. But recent changes in EU and UK legislation, to regulate tech platforms, might make it easier for them to remove the content and hold the platforms sharing it to account.

An evolving landscape of digital image-based abuse

The sharing of intimate images isn’t a new phenomenon. The 2020 book Image-based sexual abuse: A study of the causes and consequences of non-consensual nude or sexual imagery recalls that in the 1980s, the US porn magazine Hustler had a feature inviting readers to submit photos of women’s vaginas for publication. Several women ended up suing Hustler for publishing their images without consent.

Sophie Mortimer, manager of the Revenge Porn Helpline, explains that the landscape of online intimate image-sharing has evolved considerably in recent years. “There are images being shared after a breakup, which corresponds to the general definition of ‘revenge porn’, but we are seeing more and more images being shared on online forums, just for fun. There’s also a trend where these images are being used as a pattern of control in relationships. It might be a way of coercing a person to stay in a relationship; or they are used in court proceedings to denigrate one party,” says Mortimer. She also notes that most victims want the images taken down as quickly as possible, and do not necessarily wish to press charges.

Recently, Revenge Porn Helpline spearheaded the StopNCII.org initiative, supported by Meta (Facebook and Instagram’s parent company), to help people proactively thwart the sharing of their intimate images on tech platforms.

If somebody is concerned that their intimate pictures might have been posted online, they can create a case on the StopNCII.org platform. Its ‘hash-generating technology’ will assign a unique numerical code to an image, and create a secure digital fingerprint, which will be shared with the tech companies participating in the scheme so that they can detect if someone has shared or is trying to share those images on their platforms.

This feature will also prevent the further sharing of photos. However, for the moment, this technology is limited only to Facebook and Instagram, and doesn’t concern content posted on porn sites.
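The matching scheme described above can be sketched in a few lines of code. The snippet below is a simplified illustration only: it uses a plain SHA-256 cryptographic hash, whereas a real system like StopNCII.org relies on perceptual hashing that can survive resizing and re-encoding, and the hash is computed on the user’s own device so the image itself is never uploaded.

```python
import hashlib


def fingerprint(image_bytes: bytes) -> str:
    """Produce a fixed-length digital fingerprint of an image.

    SHA-256 is used here purely for illustration; production systems
    use perceptual hashes that tolerate resizing and re-compression.
    """
    return hashlib.sha256(image_bytes).hexdigest()


# The person reporting computes the hash locally; only the hash
# (not the image) is shared with participating platforms.
reported_hashes = {fingerprint(b"victim-image-data")}


def should_block(upload_bytes: bytes) -> bool:
    """Check an incoming upload against the shared list of reported hashes."""
    return fingerprint(upload_bytes) in reported_hashes


print(should_block(b"victim-image-data"))  # the reported image is detected
print(should_block(b"unrelated-image"))    # an unrelated image passes through
```

The key design point is that the fingerprint is one-way: platforms can detect a re-upload of a reported image, but cannot reconstruct the image from the hash.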

When Anna tried to remove her photos from pornographic sites, she faced several hurdles. “On some websites, there is a contact form, but for some, I needed to search for contacts. Some platforms require you to leave your name and email address and provide links to the content you want to remove, which made me feel uncomfortable. Sometimes you don’t even receive a notification that they received your message, nor that the content was taken down. I was just refreshing the sites regularly,” she recalls.

Anna started the website Anna Nackt to help other victims in the process. “In the beginning, I was literally typing into Google, ‘My naked pictures are online, what can I do?’ On my website, I wanted to put up some basic information like what to do if porn platforms don’t reply, or how you can set up an anonymous email address to report the content, or how you speak to the police,” she says.

Julia Słupska, a doctoral student at the Centre for Doctoral Training in Cybersecurity at the University of Oxford and the Oxford Internet Institute, emphasises how unfair it is that the burden of removing the content falls on the victims. “The victims-survivors [are forced] to take on a customer service role. They either need to figure it out themselves or find people who will help them navigate through the whole process, and this should not be their job – especially at a time when they’re extremely fragile. The companies should be doing this, and they need to invest more resources,” she says.

Legislative solutions

In December 2020, the European Commission proposed the Digital Services Act (DSA), an overhaul of an EU legal framework governing digital services that had remained largely unchanged since the adoption of the e-Commerce Directive in 2000. The proposal aims to better regulate big tech platforms and hold them accountable. A further updated text, which introduced several changes to the Commission proposal, was voted through by the European Parliament this January.

“The Digital Services Act is very important because it defines the rights of users towards all sorts of online platforms, porn platforms included,” says Josephine Ballon, lawyer and head of legal at Hate Aid, a German organisation that offers protection from digital violence and supports the sanctioning of perpetrators. “Across the EU, we have a big problem identifying perpetrators, because we are dealing with a very special situation here. We have platforms located abroad, national law enforcement, and criminal laws that are not harmonised within the EU. And on the internet, everything is cross-border. The police and law enforcement are not sensitive enough to these topics. It is as if they separate the real world and the internet, which is not possible anymore,” she says.

Alexandra Geese, a German MEP and shadow rapporteur for the Greens/EFA on the DSA, agrees: “[In our proposed amendments] we decided to focus on the porn platforms – the sites that have more than 50 per cent of the porn content – and we focused on three different measures: stronger identification requirements for the people who upload the porn content; human content moderation; and better and quicker removal of materials for the victims.” This means that those who upload the content will need to identify themselves with an email and phone number, while the victims would be able to remove the content without giving away their personal data.

At the same time, earlier this month the UK government proposed changes to the Online Safety Bill, including provisions outlawing, among other things, revenge porn content. If platforms fail in their duty of care, which includes acting on harmful content, the UK media regulator Ofcom will be given the power to fine them up to £18 million, or ten per cent of qualifying revenue. The Online Safety Bill is to be introduced to parliament over the next few months.

The final text of the DSA is yet to be negotiated with the Council of the EU, but Geese isn’t confident that the Council will fully support it. She also warns that, even if the text is passed in its entirety, it will remain limited in scope.

“It only targets porn platforms. Most groups on Telegram or WhatsApp, where intimate and porn images are being exchanged en masse, are excluded from the legislation, as they are considered private communication,” she notes. The identification process is also faulty. “In some EU countries, one can buy a number without any ID. This means the legislation will not stop criminal organisations, although it should deter a large majority of the population from uploading illegal content.”

Moreover, while further regulation of platforms represents an important milestone for the victims of digital image-based abuse, it doesn’t mean that the perpetrators will be discovered and punished, as this remains the competence of national legislation. Anna contacted the police in Germany, where she’s based, but they never found the culprit, nor did they know how best to help her. “When I first contacted them, they suggested I ‘call Google’,” she remembers. For many women, the reporting process can add trauma to the trauma. “From the victims in Germany and other countries, we hear that the police are not sensitised to these topics. They often blame women and ask them if they uploaded the material,” says Ballon.

Słupska believes that a gender-based approach to both the legislation and to cybersecurity work could make a difference. “Cybersecurity is often discussed at the level of the military, financial institutions or data breaches, but it is not preventing crimes that disproportionately affect women. A gendered approach to cybersecurity would take into account social structures and look into how insecurity, gender, social status and/or race interact with the vulnerability to online attacks,” she explains.

This investigation is supported by a grant from the IJ4EU fund. The International Press Institute, the European Journalism Centre and any other partners in the IJ4EU fund are not responsible for the content published, or any use made out of it.