
Details of Grant 

EPSRC Reference: EP/V026518/1
Title: UKRI Trustworthy Autonomous Systems Node in Functionality
Principal Investigator: Windsor, Dr S
Other Investigators:
Eder, Professor K; Downer, Dr J; Ives, Professor JCS; Rossiter, Professor JM; Hauert, Dr S
Researcher Co-Investigators:
Project Partners:
BT; Burges Salmon LLP; Defence Science & Tech Lab (DSTL); Foster and Partners; Liverpool Data Research Associates (LDRA); LV=GI; Martyn Thomas Associates Limited; Ocado Technology; Olympus Surgical Technologies Europe; Prof Simon Gregory; ROVCO LIMITED; Thales Ltd; Toshiba Europe Limited (UK)
Department: Aerospace Engineering
Organisation: University of Bristol
Scheme: Standard Research
Starts: 01 November 2020
Ends: 30 April 2024
Value (£): 3,315,004
EPSRC Research Topic Classifications:
Artificial Intelligence; Control Engineering; Ethics; Robotics & Autonomy; Software Engineering
EPSRC Industrial Sector Classifications:
Transport Systems and Vehicles
Related Grants:
Panel History:
Panel Date | Panel Name | Outcome
14 Sep 2020 | Trustworthy Autonomous System Nodes Interview Panel A | Announced
Summary on Grant Application Form
'Autonomous systems' are machines with some form of decision-making ability, which allows them to act independently of a human controller. This kind of technology is already all around us, from traction control systems in cars to the helpful assistants in mobile phones and computers (Siri, Alexa, Cortana). Some of these systems have more autonomy than others: some are very predictable and will only react in the way they are initially set up, whereas others have more freedom and can learn and react in ways that go beyond their initial setup. This can make them more useful, but also less predictable.

Some autonomous systems have the potential to change what they do, and we call this 'evolving functionality'. This means that a system designed to do a certain task in a certain way may 'evolve' over time either to do the same task in a different way or to do a different task entirely, all without a human controller telling it what to do. These kinds of systems are being developed because they are potentially very useful, with a wide range of possible applications ranging from minimal-downtime manufacturing through to emergency response and robotic surgery. The ability to evolve in functionality offers the potential for autonomous systems to move from conducting well-defined tasks in predictable situations to undertaking complex tasks in changing real-world environments.

However, systems that can evolve in function raise legitimate concerns about safety, responsibility and trust. We learn to trust technology because it is reliable, and when a technology is not reliable, we discard it because it cannot be trusted to function properly. But it may be difficult to learn to trust technology whose function is changing. We might also ask important questions about how functional evolutions are monitored, tested and regulated for safety in appropriate ways. For example, just because a robot with the ability to adapt to handle differently shaped objects passes safety testing in a warehouse does not mean that it will necessarily be safe if it is used to do a similar task in a surgical setting. It is also unclear who, if anyone, bears the responsibility for the outcomes of functional evolution, whether positive or negative.

This research seeks to explore and address these issues by asking how we can, or should, place trust in autonomous systems with evolving functionality. Our approach is to use three evolving technologies, which operate in fundamentally different ways: swarm systems, soft robotics and unmanned air vehicles. This allows our findings to be used across a wide range of different application areas. We will study these systems in real time to explore both how they are developed and how features can be built into the design process to increase trustworthiness, an approach termed Design-for-Trustworthiness. This will support the development of autonomous systems with the ability to adapt, evolve and improve, but with the reassurance that these systems have been developed with methods that ensure they are safe, reliable and trustworthy.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
Impacts
Sectors submitted by the Researcher
Project URL:  
Further Information:  
Organisation Website: http://www.bris.ac.uk