
Details of Grant 

EPSRC Reference: EP/V025279/1
Title: Turing AI Fellowship: Trustworthy Machine Learning
Principal Investigator: Weller, Dr A
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Chelsea & Westminster Hosp NHS Fdn Trust; Clifford Chance LLP (UK); DeepMind;
GNS Healthcare; Massachusetts Institute of Technology; Max Planck Institutes;
Simmons Wavelength Limited; University of Alicante
Department: Engineering
Organisation: University of Cambridge
Scheme: EPSRC Fellowship - NHFP
Starts: 01 January 2021
Ends: 31 December 2025
Value (£): 1,283,429
EPSRC Research Topic Classifications:
Artificial Intelligence
EPSRC Industrial Sector Classifications:
Healthcare
Related Grants:
Panel History:
Panel Date | Panel Name | Outcome
06 Oct 2020 | Turing AI Acceleration Fellowship Interview Panel C | Announced
Summary on Grant Application Form
Machine learning (ML) systems are increasingly being deployed across society, in ways that affect many lives. We must ensure that there are good reasons for us to trust their use. That is, as Baroness Onora O'Neill has said, we should aim for reliable measures of trustworthiness. Three key measures are:

Fairness - measuring and mitigating undesirable bias against individuals or subgroups;

Transparency/interpretability/explainability - improving our understanding of how ML systems work in real-world applications; and

Robustness - aiming for reliably good performance even when a system encounters different settings from those in which it was trained.

This fellowship will advance key technical underpinnings of fairness, transparency and robustness in ML systems, and develop timely applications that work at scale in real-world health and criminal justice settings, focusing on the interpretability and robustness of medical imaging diagnosis systems and of criminal recidivism prediction. The project will engage with industry, social scientists, ethicists, lawyers, policy makers, stakeholders and the broader public, aiming for two-way engagement: listening carefully to needs and concerns in order to build the right tools, and in turn informing policy, users and the public in order to maximise beneficial impacts for society.

This work is of key national importance for the core UK strategy of being a world leader in safe and ethical AI. As the Prime Minister said in his first speech to the UN, "Can these algorithms be trusted with our lives and our hopes?" If we get this right, we will help ensure fair, transparent benefits across society while protecting citizens from harm, and avoid the potential for a public backlash against AI developments. Without trustworthiness, people will have reason to be afraid of new ML technologies, presenting a barrier to responsible innovation. Trustworthiness removes frictions preventing people from embracing new systems, with great potential to spur economic growth and prosperity in the UK, while delivering equitable benefits for society. Trustworthy ML is a key component of Responsible AI - just announced as one of four key themes of the new Global Partnership on AI.

Further, this work is needed urgently: ML systems are already being deployed in ways that affect many lives. Healthcare and criminal justice in particular are crucial areas with timely potential to benefit from new technology that improves outcomes, consistency and efficiency, yet they raise important ethical concerns which this work will address. The Covid-19 pandemic and the Black Lives Matter movement underline the urgency of these issues.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
Impacts
Description
Summary
Date Materialised
Sectors submitted by the Researcher
Project URL:  
Further Information:  
Organisation Website: http://www.cam.ac.uk