Abstract
Artificial intelligence decision-making can cause discriminatory harm to many vulnerable groups. Redress is often sought through increased transparency of these systems. But for whom are we implementing it? This article examines what transparency means for technical, legislative, and public realities and stakeholders.
Original language | English |
---|---|
Pages (from-to) | 36-44 |
Number of pages | 9 |
Journal | COMPUTER |
Volume | 53 |
Issue number | 11 |
DOIs | |
Publication status | Published - Nov 2020 |
Transparency for Whom? Assessing Discriminatory Artificial Intelligence

Projects
- 1 Finished
- DADD: Discovering and Attesting Digital Discrimination
Such, J. (Primary Investigator), Tasioulas, J. (Co-Investigator), Nelken, D. (Co-Investigator), Viganò, L. (Co-Investigator), Criado Pacheco, N. (Co-Investigator), Hedges, M. (Co-Investigator) & Coté, M. (Co-Investigator)
EPSRC Engineering and Physical Sciences Research Council
1/08/2018 → 31/05/2022
Project: Research