| Field | Value |
| --- | --- |
| Original language | English |
| Title of host publication | Data Protection and Privacy |
| Subtitle of host publication | Enforcing Rights in a Changing World |
| Editors | Dara Hallinan, Ronald Leenes, Paul De Hert |
| Publisher | Hart Publishing |
| Chapter | 5 |
| Pages | 127-156 |
| Volume | 14 |
| ISBN (Electronic) | 978-1-50995-453-7, 978-1-50995-452-0 |
| ISBN (Print) | 978-1-50995-451-3 |
| DOIs | |
| Published | 21 Jan 2022 |
| Event | International Conference on Computers, Privacy and Data Protection (CPDP), Brussels, Belgium, 27 Jan 2021 → 29 Jan 2021 (conference number: 14) |
| Conference | International Conference on Computers, Privacy and Data Protection |
| Abbreviated title | CPDP |
| Country/Territory | Belgium |
| City | Brussels |
| Period | 27/01/2021 → 29/01/2021 |
The increasing dependence of decision-making on some level of automation has naturally led to discussions about the trustworthiness of such automation, to calls for transparent automated decision-making, and to the emergence of ‘explainable Artificial Intelligence’ (XAI). Although XAI research has produced a number of taxonomies for explaining Artificial Intelligence (AI) and Machine Learning (ML) models, the legal debate has so far focused mainly on whether a ‘right to explanation’ exists in the GDPR. More recently, a growing body of interdisciplinary literature has concentrated on the goals and substance of the explanations produced for automated decision-making, with a view to clarifying their role and improving their value against unfairness, discrimination and opacity, for the purposes of ensuring compliance with Article 22 of the GDPR. At the same time, several researchers have warned that transparency of the algorithmic processes is not in itself enough, and that tools are needed for better and easier assessment and review of the whole socio-technical system in which automated decision-making is embedded. In this chapter, we suggest that generating computed explanations would be useful for most of the obligations set forth by the GDPR and can support a holistic compliance strategy when used as detective controls. Computing explanations to support the detection of data protection breaches facilitates the monitoring and auditing of automated decision-making pipelines. Carefully constructed explanations can empower both the data controller and external recipients such as data subjects and regulators, and should be seen as key controls for meeting accountability and data-protection-by-design obligations. To illustrate this claim, the chapter presents the work undertaken by the Provenance-driven and Legally-grounded Explanations for Automated Decisions (PLEAD) project towards ‘explainable-by-design’ socio-technical systems. PLEAD acknowledges the dual function of explanations as internal detective controls (benefiting data controllers) and external detective controls (benefiting data subjects) and leverages provenance-based technology to compute explanations and support the deployment of systematic compliance strategies.
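To make the provenance-based approach concrete, the self-contained Python sketch below records a toy loan-decision pipeline as a PROV-style provenance graph and walks it backwards from the decision to compute a rudimentary explanation. This is a minimal illustration of the general technique, not the PLEAD implementation: the node identifiers, the `ProvenanceGraph` class and its `explain` traversal are all hypothetical, and a production system would use the W3C PROV data model rather than this ad hoc structure.

```python
from dataclasses import dataclass, field


@dataclass
class Node:
    """A node in a PROV-style provenance graph (entity, activity, or agent)."""
    id: str
    kind: str                      # 'entity' | 'activity' | 'agent'
    attrs: dict = field(default_factory=dict)


@dataclass
class Edge:
    """A directed provenance relation, e.g. 'used' or 'wasGeneratedBy'."""
    relation: str
    source: str                    # id of the node the relation starts from
    target: str                    # id of the node it points to


class ProvenanceGraph:
    """A toy provenance store; identifiers and relations follow PROV naming."""

    def __init__(self) -> None:
        self.nodes: dict[str, Node] = {}
        self.edges: list[Edge] = []

    def add(self, node: Node) -> None:
        self.nodes[node.id] = node

    def relate(self, relation: str, source: str, target: str) -> None:
        self.edges.append(Edge(relation, source, target))

    def explain(self, entity_id: str) -> list[str]:
        """Walk backwards from a decision entity and narrate how it came about."""
        sentences: list[str] = []
        seen: set[str] = set()
        frontier = [entity_id]
        while frontier:
            current = frontier.pop()
            if current in seen:
                continue
            seen.add(current)
            for edge in self.edges:
                if edge.source == current:
                    sentences.append(f"{current} {edge.relation} {edge.target}")
                    frontier.append(edge.target)
        return sentences


# Record a toy loan-decision pipeline (all identifiers are illustrative).
g = ProvenanceGraph()
g.add(Node("application/42", "entity", {"declared_income": 28000}))
g.add(Node("credit-model/v3", "entity"))
g.add(Node("scoring-run/17", "activity"))
g.add(Node("decision/42", "entity", {"outcome": "declined"}))
g.add(Node("loan-officer/7", "agent"))

g.relate("wasGeneratedBy", "decision/42", "scoring-run/17")
g.relate("used", "scoring-run/17", "application/42")
g.relate("used", "scoring-run/17", "credit-model/v3")
g.relate("wasAssociatedWith", "scoring-run/17", "loan-officer/7")

for line in g.explain("decision/42"):
    print(line)
# decision/42 wasGeneratedBy scoring-run/17
# scoring-run/17 used application/42
# scoring-run/17 used credit-model/v3
# scoring-run/17 wasAssociatedWith loan-officer/7
```

Read as detective controls in the chapter's sense, each narrated relation is a checkable audit fact: an internal reviewer can verify that the scoring run used the expected model version, while a data subject or regulator can see which inputs fed the decision and who was responsible for the run.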