TY - JOUR
T1 - Investigating the Legality of Bias Mitigation Methods in the United Kingdom
AU - Jorgensen, MacKenzie
AU - Waller, Madeleine
AU - Cocarascu, Oana
AU - Criado, Natalia
AU - Rodrigues, Odinaldo
AU - Such, Jose
AU - Black, Elizabeth
PY - 2023/12
Y1 - 2023/12
N2 - Algorithmic Decision-Making Systems (ADMS) fairness issues have been well highlighted over the past decade [1], including some facial recognition systems struggling to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.’s employment tribunal for unfair dismissal resulting from automated facial recognition technology by Microsoft [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS. These typically operationalize fairness notions as fairness metrics to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as “mitigated ADMS” or, in the singular, a “mitigated system.”
AB - Algorithmic Decision-Making Systems (ADMS) fairness issues have been well highlighted over the past decade [1], including some facial recognition systems struggling to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.’s employment tribunal for unfair dismissal resulting from automated facial recognition technology by Microsoft [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS. These typically operationalize fairness notions as fairness metrics to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as “mitigated ADMS” or, in the singular, a “mitigated system.”
UR - https://ieeexplore.ieee.org/document/10410096
U2 - 10.1109/MTS.2023.3341465
DO - 10.1109/MTS.2023.3341465
M3 - Article
SN - 0278-0097
VL - 42
SP - 87
EP - 94
JO - IEEE Technology and Society Magazine
JF - IEEE Technology and Society Magazine
IS - 4
ER -