With roosters crowing in the background as he speaks from the crowded refugee camp in Bangladesh that’s been his home since 2017, Maung Sawyeddollah, 21, describes what happened when violent hate speech and disinformation targeting the Rohingya minority in Myanmar began to spread on Facebook.
“We were good with most of the people there,” he said. “But some very narrow-minded and nationalist people spread hate against Rohingya via Facebook. And the people who were kind, who had been in close contact with Rohingya, changed their minds about the Rohingya, and it turned to hate.”

For years, Facebook, now known as Meta Platforms Inc., pushed the narrative that it was a neutral platform in Myanmar that was misused by malicious people, and that despite its best efforts to remove violent and hateful material, it fell short. That narrative echoes its response to the role it has played in other conflicts around the world, whether the 2020 election in the U.S. or hate speech in India. But a new, comprehensive report by Amnesty International argues that Facebook’s preferred narrative is false. The platform, Amnesty says, wasn’t merely a passive site with insufficient content moderation; instead, Meta’s algorithms “proactively amplified” content on Facebook that incited hatred against the Rohingya, beginning as early as 2012.
Despite years of warnings, the company not only failed to remove hate speech and disinformation targeting the Rohingya, it actively amplified such content, until the campaign culminated in the 2017 massacre. This coincided with the rising popularity of Facebook in Myanmar, where for many people it was their only connection to the internet, effectively making Facebook the internet for a large portion of Myanmar’s population.
More than 700,000 Rohingya fled into neighboring Bangladesh that year as Myanmar security forces were accused of mass rapes, killings and the torching of thousands of Rohingya homes.
“Meta — through its dangerous algorithms and its relentless pursuit of profit — substantially contributed to the serious human rights violations perpetrated against the Rohingya,” the report says. A spokesperson for Meta declined to answer questions about the Amnesty report. In a statement, the company said it “stands in solidarity with the international community and supports efforts to hold the Tatmadaw accountable for its crimes against the Rohingya people.”
“Our safety and integrity work in Myanmar remains guided by feedback from local civil society organizations and international institutions, including the U.N. Fact-Finding Mission on Myanmar; the Human Rights Impact Assessment we commissioned in 2018; as well as our ongoing human rights risk management,” Rafael Frankel, director of public policy for emerging markets, Meta Asia-Pacific, said in a statement.
Like Sawyeddollah, who is quoted in the Amnesty report and spoke with the AP on Tuesday, most of the people who fled Myanmar — about 80% of the Rohingya who were living in Myanmar’s western state of Rakhine at the time — are still staying in refugee camps. They are asking Meta to pay reparations for its role in the violent repression of Rohingya Muslims in Myanmar, which the U.S. declared a genocide earlier this year.
Amnesty’s report, out Wednesday, is based on interviews with Rohingya refugees, former Meta staff, academics, activists and others. It also relied on documents disclosed to Congress last January by whistleblower Frances Haugen, a former Facebook data scientist. The report acknowledges that Meta has improved its civil society engagement and some aspects of its content moderation practices in Myanmar in recent years, a point also noted by digital rights activists. In January 2021, after a violent coup, it banned the country’s military from the platform.
But critics, including some of Facebook’s own employees, have long maintained that such an approach will never truly work. It leaves Meta trying to remove harmful content while its algorithms, designed to push “engaging” content that’s more likely to get people riled up, effectively work against it.
“These algorithms are really dangerous to our human rights. And what happened to the Rohingya, and Facebook’s role in that particular conflict, risks happening again in many different contexts across the world,” said Pat de Brún, Amnesty researcher and adviser on artificial intelligence and human rights.
“The company has shown itself completely unwilling or incapable of resolving the root causes of its human rights impact.”
After the U.N.’s Independent International Fact-Finding Mission on Myanmar highlighted the “significant” role Facebook played in the atrocities perpetrated against the Rohingya, Meta admitted in 2018 that “we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.”
In the following years, the company “touted certain improvements in its community engagement and content moderation practices in Myanmar,” Amnesty said, adding that its report “finds that these measures have proven wholly inadequate.”
In 2020, for instance, three years after the violence in Myanmar killed thousands of Rohingya Muslims and displaced 700,000 more, Facebook investigated how a video by a leading anti-Rohingya hate figure, U Wirathu, was circulating on its site.
The probe revealed that over 70% of the video’s views came from “chaining” — that is, it was suggested to people who had watched a different video, showing them what’s “up next.” Facebook users were not seeking the video out; it was fed to them by the platform’s algorithms.
Wirathu was banned from Facebook in 2018.
“Even a well-resourced approach to content moderation would not have been enough to prevent and mitigate these algorithmic harms,” Amnesty’s report says, because content moderation fails to address the root cause: Meta’s algorithmic amplification of harmful content.
The Rohingya refugees are seeking unspecified damages from the Menlo Park, California-based social media giant for its role in facilitating genocide. Meta, which is the subject of twin lawsuits in the U.S. and the U.K. seeking $150 billion on behalf of Rohingya refugees, has so far refused.
“We believe the genocide against Rohingya was possible only because of Facebook,” Sawyeddollah said. “They communicated with each other to spread hate, and they organized campaigns via Facebook. And Facebook was silent.”