
Details of Grant 

EPSRC Reference: EP/X028569/1
Title: Satisficing Trust in Human Robot Teams
Principal Investigator: Baber, Professor C
Other Investigators:
Musolesi, Professor M
Hunt, Dr ER
Waterson, Professor PE
Milivojevic, Dr S
Researcher Co-Investigators:
Project Partners:
Department: School of Computer Science
Organisation: University of Birmingham
Scheme: Standard Research
Starts: 01 June 2023
Ends: 31 May 2026
Value (£): 1,218,852
EPSRC Research Topic Classifications:
Artificial Intelligence
Control Engineering
Intelligent & Expert Systems
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Aerospace, Defence and Marine
Related Grants:
Panel History:
Panel Date: 22 Jun 2022
Panel Name: National Security Sandpit 1 Full Proposal
Outcome: Announced
Summary on Grant Application Form
In this project, we design and develop Human-Robot Teams (using experiments with real robots and modelling with Reinforcement Learning) to conduct urban search and related activity. A team will consist of 1-3 human operators and 2-6 robots. We extend the definition of a 'team' beyond robots and humans on the ground. Drawing an analogy with the management of major incidents in the UK Emergency Services, operational activity is performed at the 'bronze' level, i.e., by the local human-robot team, which is overseen by tactical coordinators at the 'silver' level (e.g., providing guidance on legal or other constraints) and which answers to high-level strategic command at the 'gold' level (e.g., redefining goals for the mission). In this way, the 'team' is more than local coordination, and trust applies through the command hierarchy as well as horizontally across each level.

Communication may be intermittent, and the mission's goals and constraints might change during the mission. This is a further driver of variation in trust, along with mission, activity, situation, etc. Each team member, human or robot, will be allocated tasks within the team and will perform these autonomously. Key to team performance will be the ability to acquire and maintain Distributed Situation Awareness: team members will have their own interpretation of the situation as they see it, and their own interpretation of the behaviour of their teammates. Teammate behaviour can be inferred from observing what teammates are doing in a given situation and whether this is to be expected, which yields behavioural markers of trust. We also consider the confidence with which teammates express their Situation Awareness, e.g., in terms of their interpretation of the data they perceive in the situation.

From the interpretation of teammate behaviour, we explore appropriately scaled trust, using the concept of a 'ladder of trust' on which trust moves up and down depending on the quality of situation awareness, the behaviour of teammates, and the threat posed by the situation (a minimal sketch of this idea follows below). From Distributed Situation Awareness, we also explore counterfactual ('what-if') reasoning to cope with uncertain and ambiguous situations, where ambiguity might relate to permissions and rights to perform tasks, or to the consequences of an action, as well as to Situation Awareness.
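As an illustration of how such a 'ladder of trust' might be operationalised, the following Python sketch steps a discrete trust level up or down in response to teammate behaviour, situation-awareness quality, and perceived threat. It is not taken from the grant itself: the class name, rung count, thresholds, and update rules are all hypothetical, chosen only to make the concept concrete.

# Illustrative sketch (hypothetical, not the project's method): a discrete
# 'ladder of trust' whose level climbs when a teammate behaves as expected
# and descends when behaviour is unexpected or the situation is threatening.

class TrustLadder:
    def __init__(self, n_rungs: int = 5, start: int = 2):
        self.n_rungs = n_rungs  # rungs 0 (no trust) .. n_rungs - 1 (full trust)
        self.level = start

    def update(self, behaviour_as_expected: bool,
               sa_quality: float, threat: float) -> int:
        """Move at most one rung per observation.

        behaviour_as_expected: did the teammate act as predicted here?
        sa_quality: 0..1 confidence in our own situation awareness.
        threat: 0..1 perceived threat posed by the situation.
        (The 0.5 and 0.7 thresholds are arbitrary placeholders.)
        """
        if behaviour_as_expected and sa_quality > 0.5:
            self.level = min(self.n_rungs - 1, self.level + 1)  # climb a rung
        elif not behaviour_as_expected or threat > 0.7:
            self.level = max(0, self.level - 1)                 # descend a rung
        return self.level

ladder = TrustLadder()
print(ladder.update(behaviour_as_expected=True, sa_quality=0.8, threat=0.1))   # -> 3
print(ladder.update(behaviour_as_expected=False, sa_quality=0.8, threat=0.2))  # -> 2

The one-rung-at-a-time rule reflects the idea that trust is scaled gradually rather than switched on or off; in the project itself, the inputs would presumably come from observed teammate behaviour and Distributed Situation Awareness rather than hand-set flags.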
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.bham.ac.uk