'Kill more': Facebook fails to detect hate speech against Rohingya

A report shared with The Associated Press showed that the rights group Global Witness submitted eight paid advertisements to Facebook for approval, each containing hate speech against the Rohingya. Facebook approved all eight ads for publication.

The group pulled the ads before they were published or paid for. Still, the results show that Facebook's porous controls continue to miss hate speech and calls to violence, despite the company's promises to improve.

After an attack by a Rohingya insurgent group in 2017, Myanmar's army launched what it called a clearance campaign in western Rakhine state. More than 700,000 Rohingya fled to Bangladesh, and security forces were accused of mass rapes, killings and the torching of thousands of homes.

On Monday, U.S. Secretary of State Antony Blinken announced that the U.S. considers the violence against the Rohingya to be genocide, a declaration intended to generate international pressure and lay the groundwork for possible legal action.

Myanmar's military seized control of the country on Feb. 1 of last year and jailed democratically elected officials. Rohingya refugees have condemned the takeover, saying it makes them more afraid to return to Myanmar.

Experts say such ads have continued to appear and that, despite its promises to do better, Facebook still fails even the most basic of tests: ensuring that paid ads on its site do not contain hate speech calling for the murder of Rohingya Muslims.

One of the ads Global Witness submitted proposed a paid post reading, "The killing of the Kalar isn't enough," using a slur often used in Myanmar to refer to people of east Indian or Muslim heritage.

"They are extremely dirty. Bengali/Rohingya women live in very poor conditions and have poor hygiene. They are not attractive," reads another.

Ronan Lee, a researcher at Loughborough University’s Institute for Media and Creative Industries, said that these posts are “shocking in their content and are a clear indication that Facebook hasn’t changed or done what it promised the public: properly regulate themselves.”

All eight of Global Witness' ads used hate speech taken directly from the report of the United Nations Independent International Fact-Finding Mission on Myanmar, with many examples drawn from past Facebook posts.

Facebook's approval of all eight ads is particularly concerning because the company claims to hold advertisements to an "even stricter" standard than regular, unpaid posts, according to its help center page for paid advertisements.

"I understand that eight is a small number, but all eight ads were accepted for publication," said Rosie Sharpe, a Global Witness campaigner, calling the results quite stark. "I believe you can draw the conclusion that the vast majority of hate speech will get through."

Meta Platforms Inc., Facebook's parent company, said it has invested in improving safety and security in Myanmar, including banning military accounts after the Tatmadaw, as the armed forces are known locally, seized power and imprisoned elected officials in the 2021 coup.

"We have created a dedicated team of Burmese speakers, banned the Tatmadaw, disrupted networks manipulating public discussion and taken action against harmful misinformation to keep people safe. We've also invested in Burmese language technology to reduce the prevalence of violating content," Rafael Frankel, director of public policy for emerging markets at Meta Asia Pacific, wrote in an emailed statement to the AP on March 17. "This work is guided by feedback from experts, civil society organizations and independent reports, including the UN Fact-Finding Mission on Myanmar's findings and the independent Human Rights Impact Assessment we commissioned and published in 2018."

Facebook has been used in the past to spread hate speech and amplify military propaganda in Myanmar.

Soon after Myanmar was connected to the internet, Facebook partnered with the country's telecom providers to let customers use the platform without paying for data, and its use quickly became widespread. For many people in Myanmar, Facebook was the internet.

Local internet policy advocates repeatedly warned that hate speech was spreading on the platform, often targeting the Muslim Rohingya minority in the majority-Buddhist country.

Tun Khin, president of the Burmese Rohingya Organisation UK, a London-based Rohingya advocacy group, said Facebook has not invested in content moderators or fact-checkers who speak local languages.

In March 2018, six months after thousands of Rohingya fled violence in western Myanmar, Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, said that social media had "substantively contributed" to public acrimony, dissension and conflict.

"Hate speech is certainly a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media," Darusman said.

Asked about Myanmar a month later, Meta CEO Mark Zuckerberg said Facebook would hire "dozens" of Burmese speakers to moderate content, work with civil society groups to identify hate figures and develop new technologies to combat hate speech.

"Hate speech is very language-specific. It's hard to do it without people who speak the local language, and we need to ramp up our effort there dramatically," Zuckerberg said.

Internal documents leaked last year by whistleblower Frances Haugen showed that the violations persisted. The company stepped up its efforts to combat hate speech but never fully developed the tools and strategies required to do so.

Rohingya refugees are suing Facebook for more than $150 billion, claiming it failed to stop hate speech that incited violence against the Muslim group by Myanmar's military rulers and their supporters. Rohingya youth organizations based in Bangladesh's refugee camps have filed a separate complaint with the 38-nation Organization for Economic Cooperation and Development (OECD), asking that Facebook provide remediation programs in the camps.

Facebook, the company now known as Meta, has not said how many of its content moderators are proficient in Burmese and able to detect hate speech from Myanmar.

"Rohingya genocide survivors still live in camps today, and Facebook continues to fail them," Tun Khin said. "Facebook needs to do more."

 
