King's College London

Research portal

Addressing Regulatory Requirements on Explanations for Automated Decisions with Provenance: A Case Study

Research output: Contribution to journal › Article › peer-review

Trung Dong Huynh, Niko Tsakalakis, Ayah Helal, Sophie Stalla-Bourdillon, Luc Moreau

Original language: English
Article number: 16e
Number of pages: 14
Journal: Digital Government: Research and Practice
Issue number: 2
Accepted/In press: 15 Nov 2020
Published: 20 Jan 2021


Abstract


AI-based automated decisions are increasingly used as part of new services being deployed to the general public. This approach to building services presents significant potential benefits, such as increased speed of execution, improved accuracy, lower cost, and the ability to adapt to a wide variety of situations. However, equally significant concerns have been raised and are now well documented, such as concerns about privacy, fairness, bias, and ethics. On the consumer side, more often than not, the users of those services are provided with no or inadequate explanations for decisions that may impact their lives.
In this paper, we report the experience of developing a socio-technical approach to constructing explanations for such decisions from their audit trails, or provenance, in an automated manner. The work has been carried out in collaboration with the UK Information Commissioner's Office (ICO). In particular, we have implemented an automated Loan Decision scenario, instrumented its decision pipeline to record provenance, categorized relevant explanations according to their audience and their regulatory purposes, built an explanation-generation prototype, and deployed the whole system in an online demonstrator.
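The idea of generating explanations from a decision pipeline's audit trail can be illustrated with a minimal sketch. This is not the authors' implementation: the thresholds, field names, and record structure below are all hypothetical, and a real system would record provenance in a standard model such as W3C PROV rather than plain dictionaries. The sketch shows only the core pattern: each pipeline step appends a record of what it used and what it generated, and an explanation is later derived from those records rather than from the decision code itself.

```python
# Hypothetical loan-decision pipeline that records a provenance-style
# audit trail, from which a consumer-facing explanation is generated.
# MIN_SCORE, field names, and step names are illustrative assumptions.

MIN_SCORE = 600  # hypothetical approval threshold


def decide_loan(application, trail):
    """Run the decision pipeline, appending one provenance record per step."""
    score = application["credit_score"]
    passed = score >= MIN_SCORE
    trail.append({"activity": "check_credit_score",
                  "used": {"credit_score": score},
                  "generated": {"passed": passed}})
    trail.append({"activity": "final_decision",
                  "used": {"passed": passed},
                  "generated": {"approved": passed}})
    return passed


def explain(trail):
    """Derive a step-by-step explanation from the recorded audit trail."""
    return "\n".join(
        f"{record['activity']}: used {record['used']} -> {record['generated']}"
        for record in trail
    )


trail = []
approved = decide_loan({"credit_score": 550}, trail)
print("Approved:", approved)
print(explain(trail))
```

Because the explanation is computed from the recorded trail, the same records can be rendered differently for different audiences (applicant, auditor, regulator), which mirrors the paper's categorization of explanations by audience and regulatory purpose.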

