Facebook and its founder Mark Zuckerberg have recently been in the limelight with the emergence of the Cambridge Analytica scandal. Further, Facebook has been at the center of attention worldwide as lawmakers and private citizens attempt to make sense of the company’s privacy rules and its responsibilities toward its users. In the United States, Facebook is under scrutiny for influencing the presidential election, while European governments (France, Germany, Ireland, the UK) are trying their best to prevent similar interference in their elections and referendums. Some debate the extent to which Facebook can influence people’s decisions based on what they see in their feed, but the question is crucial considering the real-world impact these decisions can have. We just have to think about the case of Brexit, the election of Donald Trump, or the referendum on abortion in Ireland to make this point.
The power of social media and tech companies to influence or meddle in local politics is not solely a Western problem. Cambridge Analytica, for example, attempted to influence the Nigerian elections in 2015 by using a video falsely portraying one of the candidates as a violent supporter of Sharia law. In other regions, social media can become a matter of life and death. The case of Myanmar in Southeast Asia illustrates that social media giants must consider their human rights and humanitarian impact.
The persecution of the Rohingya in Myanmar has recently grabbed international attention. The Muslim minority has faced discrimination for decades. The Rohingya are not recognized as one of Myanmar’s 135 official ethnic groups, and the resulting lack of citizenship leaves them stateless. This has severely restricted their rights to study, work, and travel, as well as their access to health care. Furthermore, several violent crackdowns by the Burmese government since the 1970s have forced Rohingyas to flee to neighboring Bangladesh. Myanmar’s gradual transition from a military dictatorship to a civilian democracy, beginning in 2010, has actually led to a worsening of the situation as Burmese authorities attempt to maintain some form of power. The authorities portray the Rohingya as a threat to the Burmese population, resulting in an increase in violent attacks against them. Since August 2017, there have been widespread reports of killings, sexual violence against women and girls, and destruction of livelihoods, which led the UN and several governments to conclude that ethnic cleansing is taking place in the country.
So, what role does Facebook play in all of this? Although internet penetration is still rather low in Myanmar, Facebook has become the main information and communication tool for many people. Indeed, UN official Marzuki Darusman recently stated that “as far as the Myanmar situation is concerned, social media is Facebook, and Facebook is social media.” Instead of only connecting people, however, social media is also being used to spread false information and hate speech against religious minority groups. Facebook Messenger has frequently been used by the Burmese military and extremists to organize mobs and spread false information to the public, including videos supposedly portraying Rohingya attacking Burmese people.
The situation is serious. Following Mark Zuckerberg’s interview with Vox, six civil society groups in Myanmar sent a letter to the Facebook CEO, providing examples of messages spread via Facebook Messenger and explaining how those messages had led to violent incidents. They contend that “the risk of Facebook content sparking open violence is arguably nowhere higher right now than in Myanmar” and express their disappointment over Zuckerberg’s lack of understanding while urging him to take action.
The use of media as a tool for mobilization and incitement of violence and hate is not new. Controlling the media has historically been a prerequisite for organized mass violence. In the years preceding genocide, newspapers and radio stations often become an outlet for hate speech and propaganda by the dominant regime. For example, Der Stürmer was well known for printing anti-Semitic propaganda prior to the Holocaust, the sort of hate speech which Joseph Goebbels continued with the creation of his paper Der Angriff. Another well-known and more recent example is Radio Télévision Libre des Mille Collines (RTLM), the radio station that played a major role in instigating violence and spreading hate speech ahead of, and during, the Rwandan genocide against the Tutsis. The station was secretly supported by the government and managed to broadcast for a long time. After the genocide, three media executives who ran the radio station were put on trial and found guilty of genocide and incitement of genocide. This was the first case concerning media brought before an international court since the founder of Der Stürmer, Julius Streicher, was convicted at Nuremberg for his anti-Semitic propaganda. In the Rwandan case, the International Criminal Tribunal for Rwanda concluded that “the power of the media to create and destroy fundamental human values comes with great responsibility” and that “those who control such media are accountable for its consequences.” The statement suggests that not only the author of hateful content stands responsible for its effects, but also the platform controlling and spreading that content.
But we are in a different era now. First, social media has a much wider reach than traditional media outlets. It can amplify hateful messages and spread false information much faster. If governments or extremist groups want to spread propaganda and incite violence, social media has become an easy tool, one that requires no printing presses or broadcasting equipment. This is not helped by the structure of the Facebook newsfeed, which promotes the spread of sensational, and often false, information over more sober news stories. Second, the platforms are owned by private companies that often lack knowledge of the local contexts in which they operate, sitting thousands of miles away in Silicon Valley.
Mark Zuckerberg did not expect his tool to be used by extremists such as the Islamic State and Buddhist fundamentalists. “I don't know that it's possible to know every issue that you're going to face down the road,” he stated during a CNN interview. He did not think that it could be used to incite violence, recruit fighters, or exacerbate existing ethno-nationalist tensions. Many of us did not, even though we should have expected this media technology to have a dark side. But suddenly, Zuckerberg and other social media giants have a big dilemma on their hands. Social media companies are realizing that countries such as Myanmar are not faceless markets where products can simply be launched, but societies with their own history and culture, where communication tools have a real impact.
Zuckerberg’s response to the situation in Myanmar on Vox angered the writers of the letter. “In your interview, you refer to your detection ‘systems.’ We believe your system, in this case, was us — and we were far from systematic,” the letter states. Zuckerberg appears detached in his response to the activists, referring to Facebook’s use of artificial intelligence and noting that the company would follow the activists’ suggestion to hire more local staff and collaborate with local organizations. The mere fact that the company lacked Burmese-speaking staff is quite astonishing. Zuckerberg’s response was naive, showing a lack of understanding, if not a lack of empathy, regarding the impact media can have on societies.
The argument that Myanmar is not Facebook’s business is unacceptable. This is a question of corporate and moral responsibility. Of course, Facebook and other social media companies did not initiate the violence that has erupted in Myanmar. The ethno-nationalist tensions and religious intolerance existed before the Internet. But these companies can do something to prevent more violence from taking place. During the Rwandan genocide, the international community could have jammed radio stations to prevent them from spreading hate. Social media companies are in a similar position now, even though their task is increasingly difficult given the rapid and far-reaching spread of information on these platforms. Facebook’s legal responsibility is difficult to determine, because social media is still new. But its moral responsibility (for what it’s worth) is clear. As recently stated by the UN, Facebook has played a “determining role” in the spread of hate speech and in driving the violence. Consequently, the company cannot remain idle and ignorant as the Rohingya are being killed and displaced. As ethno-nationalism, racism, and xenophobia become more visible around the world, Myanmar could be just one of many regions where social media becomes lethal.
If you liked this article, please consider becoming a Patron and contributing to the work we do here at The Mantle.