Investigating the Legality of Bias Mitigation Methods in the United Kingdom

MacKenzie Jorgensen*, Madeleine Waller, Oana Cocarascu, Natalia Criado, Odinaldo Rodrigues, Jose Such, Elizabeth Black

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

Fairness issues in Algorithmic Decision-Making Systems (ADMS) have been well documented over the past decade [1], including facial recognition systems that struggle to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.'s employment tribunal for unfair dismissal resulting from Microsoft's automated facial recognition technology [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS; these typically operationalize fairness notions as fairness metrics to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as "mitigated ADMS" or, in the singular, a "mitigated system."
Original language: English
Pages (from-to): 87-94
Journal: IEEE Technology and Society Magazine
Volume: 42
Issue number: 4
Publication status: Published - Dec 2023
