Why YouTube’s Decision to Remove Far-Right Content Is Not Enough

YouTube has become a growing source of radicalization, and the platform must do more to fight it.


 


The New York Times recently published an exposé detailing a growing source of online radicalization: YouTube. No longer limited to chatrooms hidden in the depths of the dark web, videos promoting far-right ideas are widely popularized and made accessible through the social media platform. With the simple click of a button, viewers can find themselves cast into a world of far-right personalities. As with Facebook and Twitter, more needs to be done to address the growing hotbed of right-wing extremism on YouTube.

 

Of particular concern is the ease with which viewers become ensnared in a vortex of far-right politics. Drawn to the persuasive allure of YouTube personalities, viewers are taught that feminism is a dangerous ideology, that Western civilization is being threatened by Muslim immigrants, and that Jews are the villains behind multiple conspiracy theories. According to a report by Data & Society titled Alternative Influence, these alt-right content creators adopt the techniques of brand influencers to effectively sell a political ideology. By connecting and interacting with other YouTube-based influencers, they move from the margins to the mainstream.

 

As Rebecca Lewis, the researcher behind the report, puts it, understanding the spread of extremist political views no longer means looking only at fringe communities; rather, “it requires us to scrutinize polished, well-lit microcelebrities and the captivating videos that are easily available on the pages of the internet’s most popular video platform.”

 

Two important factors contribute to YouTube’s growing radical right-wing hub: its recommendation algorithm, the software that determines which videos appear on a user’s homepage, and the Up Next sidebar, which suggests ever more radical content to keep viewers glued to their screens. Because YouTube’s business model depends on the virality of its videos, creators have an incentive to release ever more provocative content that will generate more views and advertising dollars. The company provides, whether intentionally or not, an avenue for right-wing cult personalities to build media empires and keep their audiences caught in a recommendation loop.

 

Following public backlash, YouTube recently decided to take down videos that broadcast neo-Nazism, Holocaust denialism, and other xenophobic views. The company also promised to change its recommendation algorithm to curb the spread of conspiracy theories. However, many videos continue to dodge YouTube’s Community Guidelines banning hate speech and harassment, turning the removal of content into a never-ending game of whack-a-mole. As seen with Facebook, relying on algorithm modification often fails to dispel the surge of online hate. Meanwhile, The Washington Post revealed that educational videos featuring clips of Hitler’s speeches to explain the origins of white-supremacist ideas were also removed. YouTube’s purge has swept out the good with the bad.

 

While radicalization is a complex phenomenon in which multiple factors can play a part, including emotional, socioeconomic, and political elements, social media has the power to expand vulnerable individuals’ access to extremist views. YouTube boasts two billion monthly active users and more than 500 hours of video uploaded every minute. According to the Pew Research Center, YouTube is now used by 94 percent of 18- to 24-year-olds in the United States, raising the probability that young adults will come across far-right extremist views.

 

In an era witnessing alarming levels of overt white supremacy, from Pittsburgh to Christchurch, we should be deeply troubled by the content available on YouTube.

 

Effectively resolving YouTube’s right-wing extremism problem must go further than the removal of content. As Sundar Pichai, CEO of Google (YouTube’s parent company), put it: “It’s also a hard societal problem because we need better frameworks around what is hate speech, what’s not, and how do we as a company make those decisions at scale and get it right without making mistakes.”

 

YouTube struggles to balance its commitment to cause no harm against arguments in favor of freedom of speech and creative freedom. It came under fire for stating that alt-right streaming star Steven Crowder’s repeated homophobic remarks toward a Vox journalist did not violate the company’s hate speech policies. It was only after a flood of protesting tweets that the company reconsidered its stance on Crowder and demonetized a portion of his videos. While YouTube may have anti-harassment and anti-hate speech rules, the company repeatedly fails to apply them in practice, maintaining an ad hoc response to popular outcry over the presence of bigoted videos.

 

It should not be up to companies alone, though they have a large part to play, to address hate speech and extremist views on their platforms. A broader discussion of what constitutes borderline content, that is, videos that come close to violating YouTube’s Community Guidelines because of their malicious or dangerous views, must take place outside the company’s walls. YouTube has started to remove such videos from users’ recommendations, yet there is no safeguarding mechanism that prevents these videos from being uploaded and released on the site in the first place.

 

This strategy demonstrates the company’s inability to take a strong stance on censoring far-right personalities, choosing to remove their audience instead of their videos. Whereas Islamist extremist content is rigorously policed online, United States lawmakers are more reluctant to silence far-right, white-supremacist personalities and groups. We need stricter and more enforceable policies around what constitutes hate speech and how it ought to be banned.

 

In the United States, federal law (Section 230 of the Communications Decency Act) continues to shield YouTube from liability for the content found on its site. The European Union, on the other hand, has been more proactive in regulating the conduct of tech companies. The European Parliament recently passed the Copyright Directive, which holds platforms responsible for copyrighted material uploaded by their users. And in 2016, the European Commission introduced the Code of Conduct on Countering Illegal Hate Speech Online, under which YouTube, along with Facebook and Twitter, agreed to review requests to remove content within 24 hours.

Another part of the issue is the sheer size of YouTube. As with other tech giants, namely Facebook, YouTube has become too big to effectively monitor its content and its toxic comments section, which has been called the worst on the internet. The platform has expanded so rapidly that the company has either failed, or did not make it a priority, to prevent people from using it to spew hate speech. As United States Senator Elizabeth Warren has argued, these companies ought to be broken up. A handful of companies dominate the social media marketplace; subjecting them to antitrust enforcement could decrease their lobbying power and allow for the introduction of more regulatory oversight.

 

When it comes to social media platforms, we should expect our policymakers to do more to ensure far-right groups do not profit from these sites. YouTube and its fellow tech giants ought to be subject to greater scrutiny, accountability, and transparency in the ways they apply their user policies. Along with broadening the conversation about what borderline content is permissible online, lawmakers need to consider whether these companies have become too big for their own good, and for society’s.

 

In the meantime, alt-right personalities will continue to profit from the cracks in YouTube’s system. Amid what the United Nations has called a global groundswell of hate speech, chances are they will not be slowing down anytime soon.

 

 


 

 
