Thursday, 8 December 2022

  • Women and girls face re-traumatization trying to remove explicit content from internet
  • Google’s inadequate reporting system fails to deliver swift and transparent responses


Survivors of online sexual abuse in South Korea have told Amnesty International their suffering has been compounded by Google’s slow and convoluted system for processing content takedown requests.

Women and girls targeted with digital sex crimes said the process for reporting non-consensual explicit content on Google was so difficult to navigate that videos of sexual abuse have ended up proliferating online.

“As a wave of digital sex crimes in South Korea causes severe harm to the women and girls who have been targeted, Google’s inadequate system for reporting non-consensual explicit content is making matters even worse,” said Jihyun Yoon, Director of Amnesty International Korea. 

“Google must do more to prevent the spread of online gender-based violence – not just in Korea, but everywhere. Survivors around the world are forced to use this same flawed reporting system when they try to get harmful content removed, so it is highly likely this issue extends way beyond Korea.”

Amnesty International today launched a global petition calling on Google to address the flaws in its reporting system.

Digital sex crimes rise despite “Nth Room” case

In March 2020, a group of South Korean journalists exposed the existence of eight secret chat rooms on the messaging app Telegram where thousands of non-consensual sexually explicit videos of women and girls were being sold for cryptocurrency. Korean police said more than 60,000 people participated in the crimes by entering these rooms, collectively known as the “Nth Room” case.

In October 2021, one of the “Nth Room” case chat operators was sentenced to 42 years in jail. However, digital sex crimes are continuing, and what distinguishes them from other instances of sexual violence is the additional harm caused by the ease of repeated sharing and distribution.

Recent criminal cases show that perpetrators habitually threaten survivors with existing video content to force them into producing more sexually abusive content. This shows that unless the non-consensual content and personal information of survivors is deleted, women and girls are subjected to further harm or crimes even when the original perpetrators are punished.

“Deleting non-consensual explicit content circulated online is essential for restoring the survivors’ daily lives. These women have no choice but to put their faith in removal requests to tech companies, despite the painful process of having to repeatedly search for and collect the non-consensual explicit content they are featured in,” said Jihyun Yoon.

“When these requests are not processed quickly and the abusive content can be re-distributed at any time, survivors are exposed to prolonged physical and mental harm.”

Google’s poorly functioning reporting system

Google says non-consensual explicit content can be removed on request. However, the survivors and activists who spoke with Amnesty International Korea said Google’s reporting categories and procedures were confusing and difficult to follow. The appropriate forms were difficult to find, and they contained ambiguous categories about the type of content being reported.

After a complaint had finally been successfully submitted, users then faced a lack of communication on the progress of their claim – often for months on end.


Amnesty International Korea carried out a survey of 25 survivors and activists, and all 11 of those who had made complaints via Google said it was difficult to confirm whether their requests had been properly processed. This was mainly due to a lack of communication from Google during the reporting process.

Survivor *Hyun-jin waited just over a year between receiving a confirmation receipt from Google and finally being informed of the outcome of a series of removal requests she had sent.

Google also appears to have crafted its takedown procedures without taking survivors’ trauma into account. When reporting content, users must tick a box acknowledging that they can be punished if the submission is untrue, while Google states it will not process incomplete complaints.

Hyun-jin said these guidelines heightened her anxiety: “I submitted it with difficulty, but rather than being convinced that it would be deleted, I became more anxious because I thought that if it didn’t work, it would be my responsibility.”

She has since prepared a 600-word model response explaining in detail why the content is unlawful, and she has shared it with other survivors to help them make removal requests.

One of the reporting forms provided by Google also requires a “photo ID” to be attached when submitting a report, failing to take into account the fact that survivors who have had explicit material distributed without their consent fear sharing their image online.

“Asking survivors to post photo IDs online, where videos of victims are circulating, is nothing short of traumatic,” said Dan of activist group Team Flame.

Google ‘the worst website in terms of secondary victimization’

Amnesty International interviewed four survivors of online gender-based violence along with six activists who have been supporting them. All survivors reported damage to their physical and mental health, including a need to isolate themselves from society to avoid stigma.

While the sexual abuse and its dissemination online already caused immense harm to these survivors, this was exacerbated when they were confronted with the slow and confusing process of trying to remove the content from the internet.

“It was so easy [for the perpetrator] to upload a video, but it took months to get it removed,” survivor Hyun-jin told Amnesty International.

She had gone to the police after a non-consensual video of her, containing sexually explicit content, was distributed online. She wrongly assumed the video would soon be deleted.

“When you are victimized like this, you have no idea what to do. I was looking at my phone and googling my name the whole day. I could barely sleep an hour a day, spending most of my time searching. I had constant nightmares, but reality itself was more of a nightmare.

“In order to delete videos, images and search keywords, I had to take hundreds of screenshots and report these to Google. I couldn’t ask someone else to do all this for me because I had to attach these harmful images depicting myself when I reported them. I had to face it all alone.

“Google has a lot of advantages – you can easily get the information you want. But to the victims, Google is nothing more than a huge distribution website. It is the worst website in terms of secondary victimization. The other day, I checked the URLs with the distributed content, and [the search results] were over 30 pages long. They weren’t even easily removed on request. [Yet] I can’t help but continue making removal requests.”

Responsibility of tech companies to prevent harm on their services

Google’s inadequate reporting system is difficult to navigate, inconsistent and hard to track, resulting in a failure to deliver swift and transparent responses to survivors.

The responsibility of all companies to respect human rights is well articulated in the UN Guiding Principles on Business and Human Rights, which state that all business enterprises should avoid causing or contributing to adverse human rights impacts through their own activities, and address such impacts when they occur.

Google’s own human rights policy states its commitment to “upholding the standards established in the United Nations Guiding Principles on Business and Human Rights”.

“By responding inconsistently and slowly to removal requests by survivors of digital sex crimes, Google is failing to respect human rights. It must adopt a survivor-centered reporting system that prevents re-traumatization and is easy to access, navigate and check on,” Jihyun Yoon said.

“Google must ensure that online gender-based violence does not occur on its services. Survivors of digital crimes need to be helped by Google’s reporting mechanisms, rather than having their suffering needlessly prolonged by them.”

Amnesty International wrote to Google on 11 November requesting a response to its findings. Google did not provide an official response, but stated in a private meeting that it considers the issue important and wants to improve how it responds to it.

*All survivors’ real identities have been protected at their request.