Months after reports first drew attention to the way Facebook has been used to incite ethnic hatred in Myanmar, the company announced it has banned several Myanmar military officials in an effort to “prevent the spread of hate and misinformation.”
The latest move comes days after more damning evidence that the social networking platform is still chock-full of “posts, comments, and pornographic images attacking the Rohingya and other Muslims,” according to an investigation by Reuters.
Once again, Facebook officials offered a mea culpa as they described attempts to curb the abuse that has led to deadly violence in the country.
“While we were too slow to act, we’re now making progress — with better technology to identify hate speech, improved reporting tools, and more people to review content,” the company wrote. “Today, we are taking more action in Myanmar: We’re removing a total of 18 Facebook accounts, one Instagram account, and 52 Facebook Pages, followed by almost 12 million people. We are preserving data on the accounts and Pages we have removed.”
In the announcement, Facebook said it was banning 20 individuals and organizations in Myanmar, among them Senior General Min Aung Hlaing, commander-in-chief of the armed forces, and the military’s Myawady television network.
“International experts, most recently in a report by the UN Human Rights Council-authorized fact-finding mission on Myanmar, have found evidence that many of these individuals and organizations committed or enabled serious human rights abuses in the country,” the company wrote. “And we want to prevent them from using our service to further inflame ethnic and religious tensions.”
Altogether, Facebook said it removed six Pages and six accounts, plus one account from Instagram, that were linked to these individuals or organizations. It also took down another 46 Pages and 12 accounts “for engaging in coordinated inauthentic behavior on Facebook.”
Amid ongoing controversy around how Facebook lost control of its platform, the situation in Myanmar has emerged as one of the social media giant’s biggest failings. Back in the spring, with the country facing a massive humanitarian crisis that forced 700,000 members of the Rohingya minority community to flee ethnic violence, the United Nations blasted Facebook for its role in spreading fake news and propaganda in the country.
Facebook CEO Mark Zuckerberg said the company was hiring more Burmese speakers to monitor content. But on August 15, Reuters revealed that the platform still had more than 1,000 posts attacking the Muslim group. That same day, Facebook officials announced new steps to fight the problem and insisted the company was making progress.
“The ethnic violence in Myanmar is horrific, and we have been too slow to prevent misinformation and hate on Facebook,” the company wrote. “It’s why we created a dedicated team across product, engineering, and policy to work on issues specific to Myanmar earlier this year. Today we’re sharing details on the investments we have made and the results they have started to yield.”
In today’s update, the company insisted it is still making progress, but admitted that the problem remains challenging.
“We continue to work to prevent the misuse of Facebook in Myanmar — including through the independent human rights impact assessment we commissioned earlier in the year,” the company wrote. “This is a huge responsibility, given so many people there rely on Facebook for information — more so than in almost any other country, given the nascent state of the news media and the recent rapid adoption of mobile phones. It’s why we’re so determined to do better in the future.”