What is algorithmic bias?

Algorithms are ‘a set of step-by-step instructions that computers follow to perform a task.’[1] Algorithms are ‘trained’ on particular sets of data (known as ‘training data’) to ‘learn’ which output is correct for certain people. The algorithm can then apply this model to other people.

Algorithms are ultimately not value-neutral, colour-blind or gender-blind. Bias, exclusion, and ‘data-centric oppression’ can arise due to unrepresentative or incomplete training data, and reliance on the programmer’s unconscious prejudices.[2]
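To make this concrete, here is a minimal, hypothetical sketch (not drawn from the cited sources) of how unrepresentative training data can produce skewed outputs. The toy ‘model’ simply predicts the majority outcome it has seen for each group, falling back to the overall majority when a group is under-represented, so the sparsely represented group ends up judged by the dominant group’s pattern:

```python
from collections import Counter

def train(examples, min_count=5):
    """examples: list of (group, label) pairs. Returns a predict function."""
    overall = Counter(label for _, label in examples)
    per_group = {}
    for group, label in examples:
        per_group.setdefault(group, Counter())[label] += 1

    def predict(group):
        counts = per_group.get(group, Counter())
        # Too few training examples for this group: the model falls back
        # on the overall (majority-dominated) pattern.
        if sum(counts.values()) < min_count:
            counts = overall
        return counts.most_common(1)[0][0]

    return predict

# Group "A" dominates the training data; group "B" is barely represented.
training = [("A", "approve")] * 90 + [("B", "deny")] * 3
model = train(training)

print(model("A"))  # "approve" — well supported by A's data
print(model("B"))  # "approve" — B's own pattern ("deny") is drowned out
```

The point of the sketch is that no one wrote a prejudiced rule; the skew comes entirely from whose data was collected in sufficient quantity.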

Algorithms have the capability to shape individuals’ decisions without them even knowing it, giving those who have control of the algorithms an unfair position of power.[3]

The harms of new technology will be most experienced by those already disadvantaged in society…effectively penalis[ing] people based on race and similar issues. – Justin Reich, Executive Director, MIT Teaching Systems Lab[4]

Algorithmic bias comes from historical human biases amongst programmers, incomplete and unrepresentative training data, and a failure to account for context when seeking to detect biases. It impacts artists’ livelihoods as arts and culture increasingly move online to reach audiences.

If the arts and culture industries remain fixed in their silo, it will continue to be the global technology companies who accelerate the distribution of the plethora of digital arts and cultural content coming down the product development pipeline. It will be the digital platforms that adapt to expand delivery channels, broaden markets, reach audience and own the data generated. – Leanne de Souza[5]

Suggestions to address algorithmic bias and the impacts on society include:

  • Anti-discrimination laws to be updated to cover the unfair impacts of algorithmic bias

  • Public education to develop ‘algorithmic literacy’

  • Government and civil society to hold algorithmic operators accountable for bias and encourage algorithmic transparency

  • National governments to enforce local content quotas on digital platforms and work with the arts industry and tech companies to ensure national culture is not unintentionally homogenised


[1] N. Turner Lee, P. Resnick and G. Barton (2019) Algorithmic Bias Detection and Mitigation: Best Practices and Policies to Reduce Consumer Harms, Brookings Institution.

[2] L. Rainie and J. Anderson (2017) Code-Dependent: Pros and Cons of the Algorithm Age, Pew Research Center.

[3] As cited in L. Rainie and J. Anderson (2017) op. cit.

[4] Ibid.

[5] L. de Souza (2020) ‘What next? The convergence of arts, culture and distribution technology’, Medium.