‘Ableism And Disability Discrimination In New Surveillance Technologies’

Algorithmic technologies now pervade every area of our lives: delivery drivers are monitored as they drop off your latest purchase, health apps track your diet and exercise, and students and workers are judged, and discriminated against, by automated decision-makers. AI technologies are here to stay.

In the CDT report ‘Ableism And Disability Discrimination In New Surveillance Technologies’, the authors show how “disabled people had their lives made more difficult, or were otherwise harmed because a poorly-trained algorithm or artificial intelligence system did not incorporate the needs of disabled people.”

The report, from DEAI member the Center for Democracy & Technology, examines four areas where algorithmic and/or surveillance technologies are used to surveil, control, discipline, and punish people, with particularly harmful impacts on disabled people, including education and the workplace.

Prepare to be startled by the accounts of facial recognition systems that could not recognise or interpret the faces of people with albinism or tumours, and of disabled students accused of cheating because behaviour such as needing to use screen readers or dictation software was flagged as prohibited by an automated proctoring service.

Read the full report, including recommendations for regulators, here.