
Details of Grant 

EPSRC Reference: EP/V010875/1
Title: LISI - Learning to Imitate Nonverbal Communication Dynamics for Human-Robot Social Interaction
Principal Investigator: Celiktutan Dikici, Dr O
Other Investigators:
Researcher Co-Investigators:
Project Partners:
SoftBank Robotics
Department: Engineering
Organisation: King's College London
Scheme: New Investigator Award
Starts: 01 May 2021
Ends: 31 October 2023
Value (£): 284,291
EPSRC Research Topic Classifications:
Artificial Intelligence
Human Communication in ICT
Human-Computer Interactions
Image & Vision Computing
EPSRC Industrial Sector Classifications:
Healthcare
Information Technologies
Education
Related Grants:
Panel History:
Panel Date: 03 Aug 2020
Panel Name: EPSRC ICT Prioritisation Panel August 2020
Outcome: Announced
Summary on Grant Application Form
We are approaching a future where robots will become progressively widespread in many aspects of our daily lives, including education, healthcare, work and personal use. All of these practical applications require humans and robots to work together in human environments, where social interaction is unavoidable. Alongside verbal communication, successful social interaction is closely coupled with the interplay between nonverbal perception and action mechanisms, such as observing another person's gaze and following their attention, or coordinating the form and function of hand-arm gestures. Humans manage social interaction instinctively and adaptively, with little conscious effort. For robots to succeed in our social landscape, they should therefore engage in social interactions in a human-like manner, with increasing levels of autonomy.

Despite the rapid growth of the fields of human-robot interaction and social robotics, the capabilities of current social robots remain limited. First, most interaction contexts have been handled through tele-operation, whereby a human operator controls the robot remotely. This approach becomes labour-intensive and impractical as robots grow more commonplace in our society. Second, designing interaction logic by manually programming each behaviour is exceptionally difficult given the complexity of the problem. Once fixed, such logic is limited: it does not transfer to unseen interaction contexts and is not robust to unpredicted inputs from the robot's environment (e.g., sensor noise).

Data-driven approaches are a promising path to addressing these shortcomings, because modelling human-human interaction is the most natural guide to designing human-robot interaction interfaces that are usable and understandable by everyone. This project aims (1) to develop novel methods for learning the principles of human-human interaction autonomously from data, and for learning to imitate these principles with robots using techniques from computer vision and machine learning, and (2) to synergistically integrate these methods into the perception and control of real humanoid robots. This project will set the basis for the next generation of robots that learn how to act in concert with humans by watching human-human interaction videos.
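To make the data-driven imitation idea concrete, a minimal sketch in the style of behavioural cloning is given below. It is purely illustrative, not the project's actual method: the feature choices, tensor shapes, model architecture (a small recurrent policy trained with PyTorch) and the stand-in random data are all assumptions introduced here for exposition.

# Illustrative sketch only: behavioural cloning of nonverbal dynamics.
# All feature choices, shapes and names here are hypothetical assumptions,
# not the project's actual pipeline.
import torch
import torch.nn as nn

class NonverbalImitator(nn.Module):
    """Maps a short window of a partner's nonverbal features
    (e.g., gaze direction, head pose) to the robot's next
    nonverbal action (e.g., gaze target, gesture parameters)."""
    def __init__(self, feat_dim=8, hidden=64, action_dim=4):
        super().__init__()
        self.encoder = nn.GRU(feat_dim, hidden, batch_first=True)
        self.policy = nn.Linear(hidden, action_dim)

    def forward(self, partner_feats):        # (batch, time, feat_dim)
        _, h = self.encoder(partner_feats)   # h: (1, batch, hidden)
        return self.policy(h.squeeze(0))     # (batch, action_dim)

# Stand-in "human-human interaction" data: random tensors in place of
# features a vision front end would extract from video.
feats = torch.randn(32, 20, 8)    # 32 clips, 20 frames, 8 features each
actions = torch.randn(32, 4)      # observed responder behaviour per clip

model = NonverbalImitator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(100):
    # Regress the observed human response from the partner's behaviour.
    loss = nn.functional.mse_loss(model(feats), actions)
    opt.zero_grad()
    loss.backward()
    opt.step()

In this framing, the "demonstrations" are pairs of (partner behaviour, human response) mined from human-human interaction videos; the trained policy could then drive a robot's nonverbal behaviour in place of the human responder.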

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: