
Details of Grant 

EPSRC Reference: EP/Y030540/1
Title: UKRI AI Centre for Doctoral Training in Lifelong Safety Assurance of AI-enabled Autonomous Systems (SAINTS)
Principal Investigator: Habli, Professor I
Other Investigators:
Chubb, Dr J
Stoneham, Professor TWC
Wilson, Professor R
Porter, Dr ZZ
Paterson, Dr C
Iglesias Urrutia, Professor CP
Iacovides, Dr I
McDermid, Professor JA
Morgan, Dr P
MacIntosh, Dr AC
Researcher Co-Investigators:
Project Partners:
Advai Ltd
BAE Systems
Bradford Teaching Hosp NHS Found Trust
British Standards Institution BSI
BT
Cambridge Consultants Ltd
Craft Prospect Ltd
DAC Beachcroft LLP
Fraunhofer IESE
Fraunhofer Institute Cognitive Systems
Healthcare Financial Management Assoc
Healthcare Safety Investigation Branch
Horiba Mira Ltd
Inst of Ergonomics and Human Factors
Jaguar Land Rover Limited
Joan & Irwin Jacobs Scholarship Prog.
Lloyd's Register Foundation
Lloyd's Register Group
MathWorks
MBDA
Medicines & Healthcare pdts Reg Acy MHRA
NATS Ltd
Office for Product Safety and Standards
Omnicom Balfour Beatty
Oxa Autonomy Ltd
Paige AI
PathLAKE Centre
QinetiQ
Thales Ltd
Thatcham Motor Insurance Repair Res Ctr
The Safety-Critical Systems Club
Tokio Marine Kiln
Ufonia
Welsh Ambulance Services NHS Trust
Wolfram Research Europe Ltd
Department: Computer Science
Organisation: University of York
Scheme: Centre for Doctoral Training
Starts: 01 April 2024
Ends: 30 September 2032
Value (£): 8,050,129
EPSRC Research Topic Classifications:
Artificial Intelligence
Human-Computer Interactions
Modelling & simul. of IT sys.
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Aerospace, Defence and Marine
Manufacturing
Healthcare
Information Technologies
Related Grants:
Panel History:
Panel Date | Panel Name | Outcome
06 Sep 2023 | UKRI CDTs in Artificial Intelligence 2023 expert panel | Announced
20 Sep 2023 | UKRI CDTs in Artificial Intelligence Interview Panel D | Announced
Summary on Grant Application Form
Artificial Intelligence (AI) is a multi-billion pound industry, estimated to deliver a 10% increase in UK GDP in 2030, and it has the potential to revolutionise how we live, work and behave.

AI is enabling a wide range of autonomous capabilities, from machine perception for remote inspection, to planning and decision-making in driverless cars, to routine robotic surgery. However, the domains and contexts where the impact of these technologies could be most profound, including the NHS, transport and national security, are dynamic and evolving, and in these settings AI system failures can cause significant harm. The UK Government is clear that safety must be paramount if the benefits of adopting AI technologies are to be unlocked. As AI and its applications continue to advance, society urgently needs professionals with the skills and knowledge to assure the safety of AI-enabled autonomous systems (AI-AS) in their real-world contexts of use.

The UKRI Centre for Doctoral Training (CDT) in Lifelong Safety Assurance of AI-enabled Autonomous Systems (SAINTS) will train 60 highly skilled professionals, from diverse disciplines (Computer Science, Philosophy, Law, Sociology and Economics), to advance the safety assurance of AI-AS. Safety assurance concerns the actions, arguments and evidence which justify confidence that systems are acceptably safe in their operating contexts. The University of York is the world leader in this field and has pioneered work in the safety assurance of AI-AS across the disciplinary spectrum. Research within SAINTS will address the following two research themes:

(1) Lifelong safety of AI-AS: safety-driven design and training for evolving contexts; testing for open and uncertain operating environments; safe retraining and continual learning; proactive monitoring procedures and dynamic safety cases; ongoing assurance of societal and ethical acceptability.

(2) Safety of increasingly autonomous AI-AS: understanding Human-AI interaction to design safe joint cognitive systems; the assurance of safe transition between human and AI-AS control; achieving effective human oversight and AI-AS explainability; preserving human autonomy and responsibility.

Safety assurance is an inherently multidisciplinary field. Whatever their disciplinary background, all future leaders in AI-AS safety will need to understand, and work within, the wider technical, ethical, legal and societal context into which the systems are deployed. The CDT training programme therefore includes expert teaching in AI, safety, philosophical ethics, law and sociology, underpinned by ongoing training in Responsible AI.

Students will pursue their doctoral research within multidisciplinary research teams, which focus on 'grand challenges' that align with the CDT's two research themes, such as the safety of human-AI teaming and the safety of AI-enabled mobile autonomous systems. Work will be grounded in use cases, co-designed with the CDT's industrial, regulatory and public sector partners, and with involvement from members of a Public Panel, to ensure the research is developed responsibly and is responsive to stakeholder needs. This will equip students with the capacities and skills to move seamlessly from doctoral research to AI-AS safety roles in industry, regulation, and the public sector, as well as to postdoctoral fellowships. This cohort-based approach is a step change in AI-AS safety. By enabling peer-to-peer learning across disciplines and with external partners, it will build resilience in the evidence base for AI-AS safety.

SAINTS will be located in the flagship Institute for Safe Autonomy at the University of York, the world's first facility dedicated to safe autonomy. The CDT is ideally placed to create and sustain the next generation of experts and a lasting community of professionals who will pioneer a new generation of evidence-based policy and practices for safe AI-AS.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Description
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Date Materialised
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.york.ac.uk