The Christchurch Attacks: YouTube as an Intermediary for Extremism

icanhearyoublogging
Jan 14, 2021
Credit: Urban Dictionary

The second public concern that Kaye (2019) discusses in relation to online content moderation is terrorism.

As discussed in my previous post, platform owners have too much power over the regulation of online content, and they don’t always get it right.

Kaye (2019) gives the example of the censorship of public information in Syria. In 2014, activist Hadi al-Khatib began curating an archive of footage uploaded to YouTube by the Syrian public, documenting the war and terrorist activity within Syria; this could later serve as a public record, as well as providing insight into the realities of life for Syrian people (Kaye, 2019). Al-Khatib and his team ran into difficulties in 2017, finding that YouTube was deleting footage before they were able to download it, under the platform’s rules against the upload of terrorist content (Kaye, 2019). Due to the sheer volume of content uploaded to YouTube, the platform relies increasingly upon algorithms to identify and remove content as quickly as possible; the issue is that the algorithms cannot contextualise the footage (Wille, 2020). What happened to the Syrian Archive was concerning: the public deserved access to the realities of war and terrorism, and the Syrian people who took the footage deserved to be seen and heard, yet YouTube’s algorithms deleted countless records of evidence.
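
To make this failure mode concrete, here is a minimal, hypothetical sketch in Python of the kind of context-blind filtering Wille (2020) describes. The labels, rules, and example videos are entirely invented for illustration; they do not reflect YouTube’s actual moderation systems.

```python
# Hypothetical sketch of context-blind automated moderation.
# The labels and rules below are invented for illustration and do
# not reflect YouTube's real systems.

TERROR_SIGNALS = {"weapons", "explosions", "militant_flag"}

def classifier_labels(video):
    """Stand-in for a machine-learning classifier that tags what a
    video depicts, but knows nothing about why it was filmed."""
    return video["labels"]

def should_remove(video):
    # The filter fires on what is shown, not on uploader intent.
    return bool(TERROR_SIGNALS & classifier_labels(video))

# Documentation of a war crime and a propaganda clip can carry
# identical visual signals, so both are removed.
documentation = {"title": "Airstrike filmed by residents",
                 "labels": {"explosions", "weapons"}}
propaganda = {"title": "Recruitment video",
              "labels": {"weapons", "militant_flag"}}

print(should_remove(documentation))  # True: evidence deleted
print(should_remove(propaganda))     # True: same signals, same outcome
```

The point of the sketch is that both videos look identical to the filter: without access to intent or context, archival evidence and extremist propaganda produce the same signals and meet the same fate.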

Source: Open Knowledge Foundation Deutschland

While YouTube’s algorithms have wrongly censored public information in the name of fighting terrorism, they may also have fallen short in combating real risks.

There is ongoing debate as to whether YouTube’s algorithms can push people toward extremist views (Lopatto, 2020). Some studies have argued against this claim (for further discussion, see Ledwich & Zaitsev, 2019; Munger & Phillips, 2020), while others have concluded that the risk is very real. Costello et al. (2016) surveyed young people to discern what they were being exposed to online, finding that the majority regularly encountered negative material used to stereotype groups (for example, on the basis of race or ethnicity), and concluded that online behaviours may foster extremist attitudes. Similarly, Phadke and Mitra (2020) found Facebook and Twitter to be effective platforms for broadcasting anti-immigration narratives of hate, with Facebook in particular offering such groups the opportunity to radicalise and recruit users. While that study does not deal directly with YouTube, it still provides useful context on the risks of online radicalisation.
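
To give the hypothesised “rabbit hole” mechanism a concrete shape, here is a toy Python simulation. It is entirely my own construction, not a model of YouTube’s real recommender: the videos, the “extremity” scale, and the engagement figures are all assumed. It simply shows how always recommending the most engaging neighbouring video could, under those assumptions, steer a viewer toward progressively more extreme material.

```python
# Toy "rabbit hole" simulation. Entirely hypothetical: the videos,
# the extremity scale, and the engagement figures are all assumed,
# and this is not a description of YouTube's actual recommender.

# Eleven videos on a 0-10 "extremity" scale; suppose, for the sake
# of argument, that engagement rises slightly with extremity.
videos = [{"id": i, "extremity": i, "engagement": 1.0 + 0.1 * i}
          for i in range(11)]

def recommend_next(current):
    """Recommend the adjacent video with the highest engagement."""
    neighbours = [v for v in videos
                  if abs(v["extremity"] - current["extremity"]) == 1]
    return max(neighbours, key=lambda v: v["engagement"])

watching = videos[2]  # the viewer starts on fairly mild content
for step in range(6):
    watching = recommend_next(watching)
    print(f"step {step}: watching video {watching['id']} "
          f"(extremity {watching['extremity']})")
# Under these assumed figures, each engagement-maximising step
# nudges the viewer one notch further up the extremity scale.
```

No single recommendation in the simulation looks dramatic; the drift only becomes visible over the sequence as a whole, which is precisely why this mechanism is so hard to study and why the research cited above remains divided.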

To explore the possibility that YouTube’s algorithms can contribute to radicalisation, I will discuss the Christchurch shootings.

Source: BBC News.

On 15th March 2019, a white supremacist attacked two mosques in Christchurch, New Zealand, killing 51 people (Sullivan, 2020). An official report into the attack found that the perpetrator considered YouTube a “significant source of information and inspiration” (Sullivan, 2020). The attacker had also frequented far-right groups on other online platforms, such as Facebook, Reddit and 4chan, which acted as forums for anti-Muslim and anti-immigration hatred (Lopatto, 2020; Sullivan, 2020). However, as New Zealand Prime Minister Jacinda Ardern put it: “What particularly stood out was the statement that the terrorist made that […] YouTube was a significant source of information and inspiration” (Lopatto, 2020). Alongside government failings, the Christchurch attacks provide a clear example of YouTube acting as an intermediary for extremist content.

References

Costello, M., Hawdon, J., Ratliff, T., & Grantham, T. (2016). Who views online extremism? Individual attributes leading to exposure. Computers in Human Behavior, 63, 311–320.

Kaye, D. (2019). Speech police: The global struggle to govern the Internet. Columbia Global Reports.

Ledwich, M., & Zaitsev, A. (2019). Algorithmic extremism: Examining YouTube’s rabbit hole of radicalization. arXiv preprint arXiv:1912.11211.

Lopatto, E. (2020, December 8). Christchurch shooter was radicalized on YouTube, New Zealand report says. The Verge. Retrieved from https://www.theverge.com/2020/12/8/22162779/christchurch-shooter-youtube-mosque-radicalized

Munger, K., & Phillips, J. (2020). Right-wing YouTube: A supply and demand perspective. The International Journal of Press/Politics.

Phadke, S., & Mitra, T. (2020, April). Many faced hate: A cross platform study of content framing and information sharing by online hate groups. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1–13).

Sullivan, H. (2020, December 8). Christchurch inquiry report released — as it happened. The Guardian. Retrieved from https://www.theguardian.com/world/live/2020/dec/08/christchurch-shooting-royal-commission-report-to-be-released-live?page=with:block-5fced4248f08a20e617d9de1#liveblog-navigation

Wille, B. (2020). “Video Unavailable”: Social media platforms remove evidence of war crimes. Report: Human Rights Watch (pp. 1–93). Retrieved from https://www.hrw.org/sites/default/files/media_2020/09/crisis_conflict0920_web_0.pdf
