Monday, 28 August 2023

Meta should immediately pay reparations to the Rohingya for the role that Facebook played in the ethnic cleansing of the persecuted minority group, Amnesty International said today, on the sixth anniversary of the Myanmar military’s brutal operation, during which its forces raped Rohingya women and girls, burned down entire villages, and killed thousands.

Facebook’s algorithms and Meta’s ruthless pursuit of profit created an echo chamber that helped foment hatred of the Rohingya people and contributed to the conditions which forced the ethnic group to flee Myanmar en masse.

“Six years have gone by since Meta contributed to the terrible atrocities perpetrated against the Rohingya people. Yet although this stands out as one of the most egregious examples of a social media company’s involvement in a human rights crisis, the Rohingya are still awaiting reparations from Meta,” said Pat de Brún, Head of Big Tech Accountability at Amnesty International.

“Our investigations have made it clear that Facebook’s dangerous algorithms, which are hard-wired to drive ‘engagement’ and corporate profits at all costs, actively fanned the flames of hate and contributed to mass violence as well as the forced displacement of over half the Rohingya population of Myanmar into neighbouring Bangladesh.

“It is high time Meta faced its responsibilities by paying reparations to the Rohingya and by fixing its business model to prevent this from happening again.”

Coincidentally, 25 August also marks an important step in holding Big Tech to account for its human rights impacts: it is the date on which key provisions of the Digital Services Act (DSA) come into force for major online platforms in the European Union. The DSA is a landmark piece of legislation aimed at strengthening rights in the digital age, and it could create ripple effects far beyond the EU.

Cox’s Bazar refugee camp. © Maung Sawyeddollah

A personal plea to Meta and Mark Zuckerberg

Today, Amnesty International and Al Jazeera publish a searing first-person account by Rohingya refugee Maung Sawyeddollah, who was forced to flee his village in Myanmar when he was just a teenager. He fled through torched villages and fields filled with dead bodies and now lives in the world’s biggest refugee camp, Cox’s Bazar in Bangladesh, with around a million of his people.

As a child, before the hate took root with the help of Facebook, he and his mostly Muslim Rohingya friends played happily with the mostly Buddhist Rakhine children from the neighbouring village — but that all changed when the military moved in.

“I’d like to meet Mark Zuckerberg and his team. Maybe they’d like to come and spend a night or two in the refugee camp?” Sawyeddollah writes. “I’d tell them: ‘Can’t you see your role in our suffering? We asked you, repeatedly, to try and help make things better for us… Yet you ignore our pleas. Tell me, do you feel anything for us? Is it only about the data, is it only about the dollars?’”

Background

Last year, Amnesty International published a report detailing Meta’s role in the atrocities committed against the Rohingya people by the Myanmar military in 2017. It revealed that Facebook’s own internal studies, dating as far back as 2012, showed Meta knew its algorithms could result in serious real-world harms. In 2016, Meta’s own research clearly acknowledged that “our recommendation systems grow the problem” of extremism.

Beginning in August 2017, the Myanmar security forces undertook a brutal campaign of ethnic cleansing against Rohingya Muslims in Myanmar’s Rakhine State. They unlawfully killed thousands of Rohingya, including young children; raped and committed other sexual violence against Rohingya women and girls; tortured Rohingya men and boys in detention sites; and burned down hundreds of Rohingya villages. The violence pushed over 700,000 Rohingya — more than half the Rohingya population living in northern Rakhine State at the beginning of the crisis — into neighbouring Bangladesh.   

Meta contributed to serious adverse human rights impacts suffered by the Rohingya in the context of the 2017 atrocities in Rakhine State and therefore has a responsibility under international human rights standards to provide an effective remedy to the community. This includes making the changes to its business model necessary to ensure this never happens again. All companies have a responsibility to respect all human rights wherever they operate in the world and throughout their operations. This is a widely recognized standard of expected conduct as set out in international business and human rights standards, including the UN Guiding Principles on Business and Human Rights (UN Guiding Principles) and the OECD Guidelines for Multinational Enterprises (OECD Guidelines).