Recommender systems and the amplification of extremist content

Joe Whittaker, Sean Looney, Alastair Reed, Fabio Votta

Research output: Contribution to journal › Article › peer-review


Abstract

Policymakers have recently expressed concerns over recommendation algorithms and their role in forming “filter bubbles”. This is a particularly pressing concern in the context of extremist content online: these algorithms may promote extremist content at the expense of more moderate voices. In this article, we make two contributions to this debate. Firstly, we provide a novel empirical analysis of three platforms’ recommendation systems when interacting with far-right content. We find that one platform—YouTube—does amplify extreme and fringe content, while two—Reddit and Gab—do not. Secondly, we contextualise these findings within the regulatory debate. There are currently few policy instruments for dealing with algorithmic amplification, and those that do exist largely focus on transparency. We argue that policymakers have yet to fully understand the problems inherent in “de-amplifying” legal, borderline content, and suggest that a co-regulatory approach may offer a route towards tackling many of these challenges.
Original language: English
Number of pages: 29
Journal: Internet Policy Review
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 30 Jun 2021
Externally published: Yes

Bibliographical note

Open access CC-BY

Keywords

  • Algorithms
  • Extremism
  • Filter bubble
  • Online radicalisation
  • Regulation

ASJC Scopus subject areas

  • Communication
  • Computer Networks and Communications
  • Management, Monitoring, Policy and Law
