October 28, 2017

A WAR OF WORDS PUTS FACEBOOK AT THE CENTER OF MYANMAR’S ROHINGYA CRISIS

By Megan Specia and Paul Mozur

Ashin Wirathu in 2013. He has been barred from public preaching in Myanmar since March. Credit: Adam Dean for The New York Times

Myanmar’s government has barred Ashin Wirathu, an ultranationalist Buddhist monk, from public preaching for the past year, saying his speeches helped fuel the violence against the country’s Rohingya ethnic group that the United Nations calls ethnic cleansing.

So he has turned to an even more powerful and ubiquitous platform to get his message out — Facebook.

Every day he posts updates, often containing false information, that spread a narrative of the Rohingya as aggressive outsiders. Posts like these have put Facebook at the center of a fierce information war that is contributing to the crisis involving the minority group. International human rights groups say Facebook should be doing more to curb hate speech on the platform, focusing as much on global human rights as on its business.

“Facebook is quick on taking down swastikas, but then they don’t get to Wirathu’s hate speech where he’s saying Muslims are dogs,” said Phil Robertson, deputy director of Human Rights Watch’s Asia division.

Across the world, Facebook and other social platforms are being questioned about their expanding role and responsibilities as publishers of information. In Britain, investigations have begun into the spread of misinformation on social media about the European Union membership referendum. In the United States, lawmakers are looking into Russian efforts to influence the 2016 presidential election on social media platforms.

In Myanmar, Facebook is so dominant that to many people it is the internet itself. And the stakes of what appears on the site are exceptionally high because misinformation, as well as explicitly hostile language, is widening longstanding ethnic divides and stoking the violence against the Rohingya ethnic group.

For example, since the government crackdown against the Rohingya began, Zaw Htay, a spokesman for the country’s de facto leader, Daw Aung San Suu Kyi, has shared dozens of posts on his Facebook page and Twitter account that include images said to show Rohingya burning their own homes. Many of these images have been debunked, yet the posts remain online.

First-person accounts from Rakhine State establish a coordinated crackdown against the Rohingya minority by the military and by ultranationalist groups, driving more than 600,000 refugees across the border into Bangladesh.

Facebook does not police the billions of posts and status updates that flow through the site worldwide each day. It relies instead on an often confusing set of “community standards” and on user reports of direct threats, which are then manually assessed and, in some cases, removed.

After the 2016 United States elections, Facebook rolled out a set of guidelines to help users identify fake news and misinformation. The company does not regularly remove misinformation itself.

Facebook has no office in Myanmar, but the company has worked with local partners to introduce a Burmese-language illustrated copy of its platform standards and will “continue to refine” its practices, said a spokeswoman, Clare Wareing, in an emailed statement.

Human rights groups say the company’s approach has allowed opinion, facts and misinformation to mingle on Facebook, clouding perceptions of truth and propaganda in a country where mobile technology has been widely adopted only in the past three years.

Under the military junta, the government deliberately kept cellphone SIM cards prohibitively expensive to control the flow of information. Restrictions loosened in 2014, SIM cards became affordable, and the use of mobile technology exploded. The number of Facebook users ballooned from about two million in 2014 to more than 30 million today. But most users do not know how to navigate the wider internet.

“Facebook has become sort of the de facto internet for Myanmar,” said Jes Kaliebe Petersen, chief executive of Phandeeyar, Myanmar’s leading technology hub, which helped Facebook create its Burmese-language community standards page. “When people buy their first smartphone, it just comes preinstalled.”

Mr. Petersen said local media and news outlets should help combat misinformation in a technology sector still in its infancy.

“There are still some challenges here, and there are of course very big differences between big cities and rural communities,” he said. “I think it’s really important that people focus on educating this new generation of digital users.”

In the meantime, Facebook has become a breeding ground for hate speech and virulent posts about the Rohingya. And because of Facebook’s design, posts that are shared and liked more frequently get more prominent placement in users’ feeds, which favors highly partisan content.

Ashin Wirathu, the monk, has hundreds of thousands of followers on Facebook accounts in Burmese and English. His posts include graphic photos and videos of decaying bodies that he says are Buddhist victims of Rohingya attacks, denunciations of the minority ethnic group, and updates that falsely identify the Rohingya as “Bengali” foreigners.

Facebook has removed some of his posts and restricted his page for stretches, but it is currently active. In an interview, Ashin Wirathu said that if Facebook did remove his account, he would simply create a new one. He added that if anyone did not like his Facebook posts, “they can sue me.”

Posts from verified government and military Facebook accounts also carry misinformation. Some, for example, refuse to acknowledge the Rohingya as an ethnic group deserving of citizenship rights, despite the fact that many have lived in Rakhine State for generations.

Gen. Min Aung Hlaing, the commander in chief of Myanmar’s armed forces who has carried out the crackdown on the Rohingya, has more than 1.3 million followers on his verified account. A post from Sept. 15 describes the operation as a response to an “attempt of extremist Bengalis in Rakhine State to build a stronghold,” after an Aug. 25 attack on remote border posts by a Rohingya militant group.

Rohingya activists also use Facebook, documenting human rights abuses, often with graphic images and videos as evidence. Sometimes the company has taken these posts down.

Ms. Wareing, the Facebook spokeswoman, said the company removed graphic content “when it is shared to celebrate the violence.” She said the company would allow graphic content if it was newsworthy, significant or important to the public interest, even if it might otherwise go against the platform’s standards.

Richard Weir, an Asia analyst with Human Rights Watch, said the situation was complicated.

“It’s a really delicate balance here between things that are violent and posted by people who would seek to inflame tensions and those that are trying to disseminate information,” Mr. Weir said. “It’s difficult to know where exactly to draw the line.”

Some of the social media conversation is happening privately. For instance, chain messages on Facebook Messenger before Sept. 11 this year falsely warned of a planned Rohingya attack against Buddhists. Written like a chain letter, the message called for people to share it, and many people were put on edge as it spread.

“I was nervous about it,” said U Tin Win, a teacher from Mandalay, the country’s second-largest city, who received the letter. “I don’t know who started the message, but I ordered my family not to go outside that day.”

Mr. Weir said that people in Myanmar relied on social media for their news.

“The government can sort of trot out its own views and spread them very rapidly, in addition to a bunch of other nonstate entities,” he said. “Views about people in Rakhine State, about the origins of the population and about things that may or may not have happened fly around Facebook extremely quickly and can create unstable situations.”


Megan Specia reported from New York, and Paul Mozur from Shanghai. Mike Isaac contributed reporting from San Francisco, and Saw Nang from Mandalay, Myanmar.