
Details of Grant 

EPSRC Reference: EP/S027238/1
Title: PLEAD: Provenance-driven and Legally-grounded Explanations for Automated Decisions
Principal Investigator: Moreau, Professor L
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Experian
Roke Manor Research Ltd
Southampton City Council
University of Aberdeen
Department: Informatics
Organisation: King's College London
Scheme: Standard Research
Starts: 07 October 2019
Ends: 31 March 2022
Value (£): 347,636
EPSRC Research Topic Classifications:
Artificial Intelligence
Information & Knowledge Mgmt
EPSRC Industrial Sector Classifications:
Financial Services
Information Technologies
Related Grants:
EP/S027254/1
Panel History:
Panel Date: 12 Feb 2019
Panel Name: Digital Economy Investigator-led Research Projects 12 February 2019
Outcome: Announced
Summary on Grant Application Form
Algorithms and Artificial Intelligence play a key role in many of the technological systems that control or affect various aspects of our lives. They optimise our driving routes according to traffic conditions; they decide whether our mortgage applications are approved; they even recommend potential life partners to us. They work silently behind the scenes, largely unnoticed, until they do not. Few of us think much about it when a credit card application is approved in two seconds; only when it is rejected do we start to question the decision. Most of the time, the answers we get are not satisfactory, if we get any at all.

The spread of such opaque automated decision-making in daily life has been driving public demand for algorithmic accountability: the obligation to explain and justify automated decisions. The main concern is that it is not right for these algorithms, effectively black boxes, to take in our data and make decisions that affect us in ways we do not understand. For this reason, the General Data Protection Regulation requires that we, as data subjects, be provided with "meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing." Likewise, consumers must be treated fairly when receiving financial services under financial services regulations, and algorithms must be free of discrimination under data protection, equality and human rights laws. However, because laws and regulations do not prescribe how to meet these requirements, businesses are left to interpret them themselves, employing a variety of means, including reports, interactive websites and even dedicated call centres, to provide explanations to their customers.

Against this background, provenance offers a starting point. Specifically, the W3C PROV standard describes how a piece of information or data was created and what influenced its production. From recorded provenance trails, we can retrace automated decisions and answer questions such as what data were used to support a decision, who or which organisation was responsible for those data, and who else might have been affected. Although provenance information is structurally simple, provenance captured from automated systems tends to be overwhelming for human consumption. Moreover, simply making provenance available to a person does not in itself constitute an explanation: the provenance must be summarised and its essence extracted before an explanation addressing a specific regulatory purpose can be constructed. How to do this is unknown today.
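As an illustration, the minimal sketch below records a single automated credit decision using the open-source Python prov package, an implementation of the PROV data model; all identifiers here (ex:creditApplication, ex:creditAssessment and so on) are illustrative examples, not artefacts of the project.

    from prov.model import ProvDocument

    # Build a small provenance document describing one automated decision.
    doc = ProvDocument()
    doc.add_namespace('ex', 'http://example.org/')

    # The input data, the decision-making activity, the resulting decision,
    # and the software agent responsible for it.
    application = doc.entity('ex:creditApplication')
    assessment = doc.activity('ex:creditAssessment')
    decision = doc.entity('ex:decision')
    service = doc.agent('ex:decisionService')

    # Relate them: the assessment used the application, was carried out by
    # the service, and generated the decision, which derives from the input.
    doc.used(assessment, application)
    doc.wasAssociatedWith(assessment, service)
    doc.wasGeneratedBy(decision, assessment)
    doc.wasDerivedFrom(decision, application)

    # Print the trail in PROV-N, the standard's human-readable notation.
    print(doc.get_provn())

Even this toy trail can answer two of the questions above: which data supported the decision (ex:creditApplication) and which agent was responsible for it (ex:decisionService). Real decision-making pipelines produce trails orders of magnitude larger, hence the need for summarisation and abstraction.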

PLEAD brings together an interdisciplinary team of technologists, legal experts, commercial companies and public organisations to investigate how provenance can help explain the logic underlying automated decision-making to the benefit of data subjects, and help data controllers demonstrate compliance with the law. In particular, we will identify various types of meaningful explanations for algorithmic decisions in relation to their purposes, and categorise them against the legal requirements applicable to UK businesses in the areas of data protection, discrimination and financial services. Building on those, we will conceive explanation-generating algorithms that process, summarise and abstract the provenance logged by automated decision-making pipelines. An Explanation Assistant tool will be created for data controllers to provision their applications with provenance-based explanation capabilities. Throughout the project, we will engage with partners, data subjects, data controllers and regulators via interviews and user studies to ensure the explanations are fit for purpose and meaningful. As a result, explanations that are provenance-driven and legally grounded will allow data subjects to place their trust in automated decisions, and will allow data controllers to ensure compliance with the legal requirements placed on their organisations.
Key Findings / Potential Use in Non-Academic Contexts / Impacts
This information can now be found on Gateway to Research (GtR): http://gtr.rcuk.ac.uk