Facebook Responds to Rohingya Genocide

Since August 2017, in what the UN has called “a textbook example of ethnic cleansing,” hundreds of thousands of Rohingya have risked everything to flee persecution in Myanmar’s northern Rakhine State for neighboring Bangladesh.
Minority groups make up close to a third of Myanmar’s population, and one of them, the Rohingya, constitutes the largest Muslim community in the country. For generations, the Rohingya have maintained their own culture and language. Yet Myanmar, a predominantly Buddhist country, refuses to grant the Rohingya citizenship, condemning them to live as “illegal citizens” in their own country.
Recently, the Myanmar government and security forces, in an effort supposedly targeting Rohingya militants, launched a series of deadly attacks that left over 6,700 Rohingya dead, while the total death toll over this past year alone is estimated at about 25,000. The horrific crimes against the Rohingya include rape, the pillaging and burning of villages, and countless deadly attacks, and a recently published UN report concludes that they amount to genocide, crimes against humanity, and war crimes. In assessing the severity of the crisis, the UN also cited the continued and extreme brutality of the crimes, the “hate rhetoric” of military commanders, policies of deliberate exclusion of the Rohingya people, and a “level of organization indicating a plan for destruction.” As a result of this growing disaster, over 700,000 people have fled Myanmar in the past year alone, leading the UN to call this the world’s fastest-growing refugee crisis.
Shortly after the UN announcement, in an uncharacteristically prompt response, Facebook announced that it had taken down 18 Facebook accounts, one Instagram account, and 52 pages linked to Myanmar military officials, with over 12 million followers in total. Specifically, Facebook banned 20 individuals and organizations from the site, including Senior General Min Aung Hlaing, commander-in-chief of the armed forces, and the military’s Myawaddy television network. Facebook also removed 46 pages and 12 accounts for “engaging in coordinated and inauthentic behavior on Facebook,” which the company says were used to spread hate speech and fuel the growing tension.
Myanmar military officials’ use of Facebook contributed to these brutal acts of violence in several ways: perpetuating false information, promulgating hate speech, rallying attendance around fabricated attacks, and using seemingly independent news outlets to push military propaganda. Although the company has acknowledged its “responsibility to fight abuse on Facebook… [which] is especially true in countries like Myanmar,” Facebook admits that it was too slow to act. Even before deactivating these pages, the company had pledged to hire more Burmese speakers to monitor hate speech on its platform in Myanmar. While this response to the UN report shows progress, the issue persists: the Facebook account of Aung San Suu Kyi, the de facto leader of the Myanmar government, was not taken down, despite allegations that it, too, is a tool for spreading false information and sparking violence.
Combating hate speech and false information is a constant battle, and Facebook and other social media and technology companies have a long road ahead in protecting their platforms against these plagues. Still, this swift action against the perpetrators of such brutal crimes marks significant progress. Social media companies, especially one as large as Facebook, bear a responsibility to maintain the integrity of information on their platforms and the safety of their users.
We will continue to work with Facebook on issues of hate speech and hate organizing on their platform. This most recent strategy gets us one step closer to a better internet space. – Muslim Public Affairs Council
