Another day, another mea culpa from Facebook for failing to properly police its platform. This time, the social giant is admitting that it didn’t act quickly enough to stop the spread of ethnic hate and violence across Myanmar.
“We were too slow to act,” Facebook stated Monday.
Trying to redeem itself, the company just removed 18 Facebook accounts, one Instagram account and 52 Facebook pages for fueling discord in the region. Together, these properties reached nearly 12 million people, by Facebook’s calculations.
Perhaps the most prominent figure losing his Facebook privileges is Senior General Min Aung Hlaing, commander-in-chief of the armed forces, while the military’s Myawady television network has also been banned from Facebook.
Among other expert resources, Facebook said the crackdown was guided by the U.N. Human Rights Council-authorized Fact-Finding Mission on Myanmar and its report detailing serious human rights abuses in the country.
Amid reports of ethnic and religious genocide in Myanmar, human-rights officials have accused Facebook of having blood on its hands.
In March, Marzuki Darusman, chairman of the U.N. Independent International Fact-Finding Mission on Myanmar, said the tech titan played a “determining role” in the country. More recently, an investigation by Reuters found Facebook’s efforts to curb hate speech in the region were coming up short.
Reuters found more than 1,000 posts, comments, images and videos on the social network attacking the marginalized Rohingya people and other minorities in Myanmar.
On Monday, Facebook insisted it is developing better technology to identify hate speech, improving its reporting tools, and adding more human content reviewers.
While the results remain questionable, these efforts are costing Facebook significant resources. In fact, the company recently said it expects total expenses to grow by as much as 60% from last year to this year, largely due to efforts to clean up its network.