MYANMAR: FACEBOOK’S SYSTEMS PROMOTED VIOLENCE AGAINST ROHINGYA; META OWES REPARATIONS

Thursday, September 29, 2022

Facebook owner Meta’s dangerous algorithms and reckless pursuit of profit substantially contributed to the atrocities perpetrated by the Myanmar military against the Rohingya people in 2017, Amnesty International said in a new report published today.

The Social Atrocity: Meta and the right to remedy for the Rohingya details how Meta knew or should have known that Facebook’s algorithmic systems were supercharging the spread of harmful anti-Rohingya content in Myanmar, but the company still failed to act.

“In 2017, the Rohingya were killed, tortured, raped, and displaced in the thousands as part of the Myanmar security forces’ campaign of ethnic cleansing. In the months and years leading up to the atrocities, Facebook’s algorithms were intensifying a storm of hatred against the Rohingya which contributed to real-world violence,” said Agnès Callamard, Amnesty International’s Secretary General.

Meta must be held to account. The company now has a responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions.

Agnès Callamard, Amnesty International Secretary General

“While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms.

“Meta must be held to account. The company now has a responsibility to provide reparations to all those who suffered the violent consequences of their reckless actions.”

Sawyeddollah, a 21-year-old Rohingya refugee, told Amnesty International: “I saw a lot of horrible things on Facebook. And I just thought that the people who posted that were bad… Then I realized that it is not only these people – the posters – but Facebook is also responsible. Facebook is helping them by not taking care of their platform.”

The Rohingya are a predominantly Muslim ethnic minority based in Myanmar’s northern Rakhine State. In August 2017, more than 700,000 Rohingya fled Rakhine when the Myanmar security forces launched a targeted campaign of widespread and systematic murder, rape and burning of homes. The violence followed decades of state-sponsored discrimination, persecution, and oppression against the Rohingya that amounts to apartheid.

Artwork: © Tamara-Jade Kaz, commissioned for the publication of Amnesty International’s research on the Rohingya’s right to remedy from Meta.

An anti-Rohingya echo chamber

Meta uses engagement-based algorithmic systems to power Facebook’s news feed, ranking, recommendation and groups features, shaping what is seen on the platform. The longer Facebook users stay on the platform, the more targeted advertising Meta can sell, and the more it profits. The display of inflammatory content – including content that advocates hatred and constitutes incitement to violence, hostility and discrimination – is an effective way of keeping people on the platform longer. As such, the promotion and amplification of this type of content is key to Facebook’s surveillance-based business model.
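To make the mechanism concrete, the sketch below shows, in deliberately simplified form, how ranking purely by predicted engagement can surface inflammatory posts. It is a hypothetical illustration only: the fields, weights and scoring function are assumptions made for this sketch, not Meta’s actual systems.

```python
# Hypothetical sketch of engagement-based feed ranking. The fields,
# weights and scoring are illustrative assumptions, not Meta's model.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    reactions: int  # likes, angry reactions, etc.
    comments: int
    shares: int

def engagement_score(post: Post) -> int:
    # Comments and shares weigh heaviest: they predict that users will
    # stay on the platform longer, which is what sells advertising.
    return post.reactions + 2 * post.comments + 3 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Ranking optimizes engagement alone; nothing here penalizes
    # content that provokes interaction through outrage or hatred.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("local news update", reactions=120, comments=5, shares=2),
    Post("inflammatory rumour", reactions=80, comments=60, shares=40),
])
print(feed[0].text)  # "inflammatory rumour" ranks first
```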

In the months and years prior to the crackdown, Facebook in Myanmar had become an echo chamber of anti-Rohingya content. Actors linked to the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting disinformation that claimed an impending Muslim takeover and portraying the Rohingya as “invaders”.

In one post that was shared more than 1,000 times, a Muslim human rights defender was pictured and described as a “national traitor”. The comments left on the post included threatening and racist messages, including ‘He is a Muslim. Muslims are dogs and need to be shot’, and ‘Don’t leave him alive. Remove his whole race. Time is ticking’.

Content inciting violence and discrimination went to the very top of Myanmar’s military and civilian leadership. Senior General Min Aung Hlaing, the leader of Myanmar’s military, posted on his Facebook page in 2017: “We openly declare that absolutely, our country has no Rohingya race.” He went on to seize power in a coup in February 2021.

In July 2022, the International Court of Justice (ICJ) ruled that it has jurisdiction to proceed with a case against the Myanmar government under the Genocide Convention based on Myanmar’s treatment of the Rohingya. Amnesty International welcomes this vital step towards holding the Myanmar government to account and continues to call for senior members of the Myanmar military to be brought to justice for their role in crimes against the Rohingya.

In 2014, Meta attempted to support an anti-hate initiative known as ‘Panzagar’ or ‘flower speech’ by creating a sticker pack for Facebook users to post in response to content which advocated violence or discrimination. The stickers bore messages such as, ‘Think before you share’ and ‘Don’t be the cause of violence’.

However, activists soon noticed that the stickers were having unintended consequences. Facebook’s algorithms interpreted the use of these stickers as a sign that people were enjoying a post and began promoting the posts they were attached to. Instead of diminishing the number of people who saw a post advocating hatred, the stickers actually made such posts more visible.
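A toy calculation, under the same hypothetical weights as the sketch above, shows why: if the ranking system records a counter-speech sticker as an ordinary comment, objecting to a hateful post raises its engagement score and therefore its reach.

```python
# Hypothetical illustration of the 'flower speech' feedback loop;
# the weights are invented for this sketch, not Meta's actual model.
def engagement_score(reactions: int, comments: int, shares: int) -> int:
    return reactions + 2 * comments + 3 * shares

before = engagement_score(reactions=50, comments=0, shares=10)   # 80

# 40 users respond with Panzagar stickers; the ranking system counts
# each sticker as an ordinary comment, not as an objection.
after = engagement_score(reactions=50, comments=40, shares=10)   # 160

assert after > before  # the counter-speech made the post MORE visible
```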

The UN’s Independent International Fact-Finding Mission on Myanmar ultimately concluded that the “role of social media [was] significant” in the atrocities in a country where “Facebook is the Internet”.

Mohamed Showife, a Rohingya activist, said: “The Rohingya just dream of living in the same way as other people in this world… but you, Facebook, you destroyed our dream.”

The Rohingya just dream of living in the same way as other people in this world… but you, Facebook, you destroyed our dream.

Mohamed Showife, Rohingya community member

Facebook’s failure to act

The report details how Meta repeatedly failed to conduct appropriate human rights due diligence on its operations in Myanmar, despite its responsibility under international standards to do so.

Internal studies dating back to 2012 indicated that Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

Between 2012 and 2017, Meta received repeated communications and visits from local civil society activists warning that the company risked contributing to extreme violence. In 2014, the Myanmar authorities even temporarily blocked Facebook because of the platform’s role in triggering an outbreak of ethnic violence in Mandalay. However, Meta repeatedly failed to heed the warnings, and consistently failed to enforce its own policies on hate speech.

Amnesty International’s investigation includes analysis of new evidence from the ‘Facebook Papers’ – a cache of internal documents leaked by whistleblower Frances Haugen.

In one internal document dated August 2019, one Meta employee wrote: “We have evidence from a variety of sources that hate speech, divisive political speech, and misinformation on Facebook… are affecting societies around the world. We also have compelling evidence that our core product mechanics, such as virality, recommendations, and optimizing for engagement, are a significant part of why these types of speech flourish on the platform.”

Our recommendation systems grow the problem

Meta report, 2016

‘Meta must pay’

Amnesty International is today launching a new campaign calling for Meta Platforms, Inc. to meet the Rohingya’s demands for remediation.

Today marks the first anniversary of the murder of prominent activist Mohib Ullah, chair of the Arakan Rohingya Society for Peace and Human Rights. Mohib was at the forefront of community efforts to hold Meta accountable.

Rohingya refugee groups have made direct requests to Meta to provide remedy by funding a US$1 million education project in the refugee camp in Cox’s Bazar, Bangladesh. The request represents just 0.002% of Meta’s 2021 profits of US$46.7 billion. In February 2021, Meta rejected the Rohingya community’s request, stating: “Facebook doesn’t directly engage in philanthropic activities.”

Showkutara, a 22-year-old Rohingya woman and youth activist, told Amnesty International: “Facebook must pay. If they do not, we will go to every court in the world. We will never give up in our struggle.”

There are at least three active complaints seeking remediation for the Rohingya from Meta. Civil legal proceedings were filed against the company in December 2021 in both the United Kingdom and the USA. Rohingya refugee youth groups have also filed an OECD complaint against Meta, which is currently under consideration by the US OECD National Contact Point.

“Meta has a responsibility under international human rights standards to remediate the terrible harm suffered by the Rohingya that it contributed to. The findings should raise the alarm that Meta risks contributing to further serious human rights abuses unless it makes fundamental changes to its business model and algorithms,” said Agnès Callamard.

“Urgent, wide-ranging reforms to their algorithmic systems to prevent abuses and increase transparency are desperately needed to ensure that Meta’s history with the Rohingya does not repeat itself elsewhere in the world, especially where ethnic violence is simmering.”

“Ultimately, States must now help to protect human rights by introducing and enforcing effective legislation to rein in surveillance-based business models across the technology sector. Big Tech has proven itself incapable of doing so when it has such enormous profits at stake.”

On 20 May 2022, Amnesty International wrote to Meta regarding the company’s actions in relation to its business activities in Myanmar before and during the 2017 atrocities. Meta responded that it could not provide information concerning the period leading up to 2017 because the company is “currently engaged in litigation proceedings in relation to related matters”.

On 14 June 2022, Amnesty International again wrote to Meta regarding the relevant allegations contained in the report, and to give the company the opportunity to respond. Meta declined to comment.

Facebook must pay. If they do not, we will go to every court in the world. We will never give up in our struggle.

Showkutara, Rohingya community member

Call on Meta to provide reparations to the Rohingya community

A refugee community is taking on a Silicon Valley tech giant. Ethnic Rohingya groups are calling for reparations for Facebook’s role in the atrocities in Myanmar. Sign the petition calling on Meta to take responsibility now.

 


