
Details of Grant 

EPSRC Reference: EP/X030156/1
Title: Human-machine learning of ambiguities to support safe, effective, and legal decision making
Principal Investigator: Tamaddoni-Nezhad, Dr A
Other Investigators:
Hunter, Dr A
Researcher Co-Investigators:
Project Partners:
BMT Group Ltd (UK)
Metropolitan Police Service
National Physical Laboratory
Thales Ltd
UKRI TAS Node in Resilience
Department: Computing Science
Organisation: University of Surrey
Scheme: Standard Research
Starts: 15 June 2023
Ends: 14 June 2026
Value (£): 887,961
EPSRC Research Topic Classifications:
Artificial Intelligence
Control Engineering
Criminology
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Aerospace, Defence and Marine
Related Grants:
Panel History:
Panel Date      Panel Name                                    Outcome
22 Jun 3000     National Security Sandpit 1 Full Proposal     Announced
Summary on Grant Application Form
Mobile autonomous robots offer huge potential to help humans and reduce risk to life in a variety of potentially dangerous defence and security (as well as civilian) applications. However, there is an acute lack of trust in robot autonomy in the real world - in terms of operational performance, adherence to the rules of law and safety, and human values. Furthermore, poor transparency and lack of explainability (particularly with popular deep learning methods) add to the mistrust when autonomous decisions do not align with human "common sense". All of these factors are preventing the adoption of autonomous robots and creating a barrier to the future vision of seamless human-robot cooperation. The crux of the problem is that autonomous robots do not perform well under the many types of ambiguity that arise commonly in the real world. These ambiguities can be caused by inadequate sensing information or by conflicting objectives of performance, safety, and legality. Humans, by contrast, are very good at recognising and resolving them.

This project aims to imbue autonomous robots with a human-like ability to handle real-world ambiguities. This will be achieved through the logical and probabilistic machine learning approach of Bayesian meta-interpretive learning (BMIL). In simple terms, this approach uses a set of logical statements (e.g., propositions and connectives) that are akin to human language. In contrast, the popular approach of deep learning uses complex multi-layered neural networks with millions of numerical connections. It is through the logical representation and human-like reasoning of BMIL that it will be possible to encode expert human knowledge into the perceptive "world model" and deliberative "planner" of the robot's "artificial brain". The human-like decision-making will be encoded in a variety of ways: A) by design, from operational and legal experts, in the form of initial logical rules; B) through passive learning of new logical representations and rules when human overrides intervene because the robot is not behaving as expected; and C) through recognising ambiguities before they arise and actively learning rules to resolve them with human assistance.
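To make the three encoding routes above concrete, the following minimal Python sketch illustrates the interaction pattern only: expert-authored rules with prior weights stand in for (A), a weight update after a human override stands in for (B), and a fall-back request for human assistance when no rule fires stands in for (C). It is not the project's BMIL system, and it is deliberately much simpler than meta-interpretive learning; all rule names, observation keys, and the update scheme are assumptions made for illustration.

# Illustrative sketch only; rule names, observation keys, and the update
# scheme are assumptions and do not represent the project's BMIL system.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str                          # human-readable label for the rule
    condition: Callable[[dict], bool]  # does this rule apply to the observation?
    action: str                        # action proposed when the condition holds
    weight: float                      # prior confidence supplied by the expert (A)

def propose_action(rules, observation):
    """Pick the action of the highest-weighted rule whose condition fires."""
    fired = [r for r in rules if r.condition(observation)]
    if not fired:
        return "request_human_assistance"  # unresolved ambiguity: ask a human (C)
    return max(fired, key=lambda r: r.weight).action

def learn_from_override(rules, observation, human_action, step=0.2):
    """Passively re-weight applicable rules after a human override (B)."""
    for r in rules:
        if r.condition(observation):
            if r.action == human_action:
                r.weight = min(1.0, r.weight + step)  # reinforce agreeing rules
            else:
                r.weight = max(0.0, r.weight - step)  # down-weight conflicting rules

# Expert-authored initial rules (A), written against assumed observation keys.
rules = [
    Rule("give_way_to_vessel", lambda o: o.get("vessel_ahead", False), "alter_course", 0.9),
    Rule("continue_search", lambda o: o.get("sonar_contact", False), "hold_course", 0.6),
]

obs = {"vessel_ahead": True, "sonar_contact": True}
print(propose_action(rules, obs))               # -> alter_course
learn_from_override(rules, obs, "hold_course")  # human operator overrides the robot
print(propose_action(rules, obs))               # -> hold_course after re-weighting

In the project's framing, the learner would revise the logical rules themselves (not just numeric weights) under a Bayesian meta-interpretive learner, but the pattern of expert priors, passive learning from overrides, and active requests for human help is analogous.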

A general autonomy framework will be developed to incorporate the new approach. It is intended that this will be applicable to all forms of autonomous robots in all applications. However, as a credible and feasible case study, we are focusing our real-world experiments on aquatic applications using an uncrewed surface vehicle (USV) or "robot boat" with underwater acoustic sensors (sonar) for searching underwater spaces. This problem is relevant in several areas of defence and security, including water gap crossing, naval mine countermeasures, and anti-submarine warfare. Specifically, our application focus will be on the police underwater search problem, which has challenging operational goals (i.e., finding small and potentially concealed objects underwater and amidst clutter), as well as considerations for the safety of the human divers and other users of the waterway (akin to the International Regulations for Preventing Collisions at Sea), and legal obligations relating to preservation of the evidence chain and timeliness due to custodial constraints.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.surrey.ac.uk