Details of Grant 

EPSRC Reference: EP/W017466/1
Title: ActivATOR - Active AudiTiOn for Robots
Principal Investigator: Evers, Dr C
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Audio Analytic Ltd (UK)
Consequential Robotics Ltd
National Oceanography Centre
University of Illinois at Urbana-Champaign
University of Oxford
Department: School of Electronics and Computer Science
Organisation: University of Southampton
Scheme: New Investigator Award
Starts: 03 October 2023
Ends: 02 October 2026
Value (£): 443,263
EPSRC Research Topic Classifications:
Artificial Intelligence
Digital Signal Processing
Instrumentation Eng. & Dev.
Music & Acoustic Technology
Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
Healthcare
Related Grants:
Panel History:
Panel Date: 28 Mar 2022
Panel Name: EPSRC ICT Prioritisation Panel March 2022
Outcome: Announced
Summary on Grant Application Form
Life in sound occurs in motion. As human listeners, audition - the ability to listen - is shaped by physical interactions between our bodies and the environment. We integrate motion with auditory perception in order to hear better (e.g., by approaching sound sources of interest), to identify objects (e.g., by touching objects and listening to the resulting sound), to detect faults (e.g., by moving objects to listen to anomalous creaks), and to offload thought (e.g., by tapping surfaces to recall musical pieces).

Therefore, the ability to make sense of and exploit sounds in motion is a fundamental prerequisite for embodied Artificial Intelligence (AI). This project will pioneer the underpinning probabilistic framework for active robot audition, enabling embodied agents to control the motion of their own bodies ('ego-motion') for auditory attention in realistic acoustic environments (households, public spaces, and scenes with multiple competing sound sources).
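
As an illustration of what such a probabilistic framework involves, the sketch below (Python with NumPy) couples a Bayesian belief over a sound source's bearing to a simple motion rule: the agent turns towards its current best estimate, because the simulated measurement is least noisy when the source is on-axis, so listening improves with motion. The sensor model, noise values, and motion rule are assumptions made purely for illustration; they are not the project's method.

    import numpy as np

    rng = np.random.default_rng(0)
    bearings = np.linspace(-np.pi, np.pi, 72, endpoint=False)  # candidate source bearings
    belief = np.full(bearings.size, 1.0 / bearings.size)       # uniform prior over bearings

    def wrap(a):
        """Wrap an angle (or array of angles) to (-pi, pi]."""
        return np.angle(np.exp(1j * a))

    def likelihood(z, candidates, noise):
        """Gaussian likelihood of a noisy bearing measurement z."""
        return np.exp(-0.5 * (wrap(z - candidates) / noise) ** 2)

    true_bearing = 1.2   # ground truth, unknown to the agent
    heading = 0.0        # the agent's current orientation

    for step in range(10):
        # Hypothetical sensor model: measurements are noisier when the
        # source is off-axis, so turning towards it genuinely helps.
        noise = 0.2 + 0.6 * abs(np.sin(wrap(true_bearing - heading)))
        z = true_bearing + rng.normal(0.0, noise)

        # Bayesian update of the belief over candidate bearings.
        belief *= likelihood(z, bearings, noise)
        belief /= belief.sum()

        # Active step: rotate towards the current maximum a posteriori
        # bearing, the ego-motion that lowers the next measurement's noise.
        heading = bearings[np.argmax(belief)]
        print(f"step {step}: heading {heading:+.2f} rad, "
              f"max belief {belief.max():.2f}")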

By integrating sound with motion, this project will enable machines to imagine, control and leverage the auditory consequences of physical interactions with the environment. By transforming the ways in which machines make sense of life in sound, the research outcomes will be pivotal for emerging markets that enable robots to augment, rather than rival, humans and so surpass the limitations of the human body (sensory accuracy, strength, endurance, memory). The proposed research therefore has the potential to transform and disrupt a wide range of industries involving machine listening, from human-robot augmentation (smart prosthetics, assistive listening technology, brain-computer interfaces) to human-robot collaboration (planetary exploration, search-and-rescue, hazardous material removal) and automation (environmental monitoring, autonomous vehicles, AI-assisted diagnosis in healthcare).

This project will consider the specific case study of a collaborative robot ('cobot') that augments the auditory experience of a hearing-impaired human partner. Hearing loss is the second most common disability in the UK, affecting 11M people. It reduces situational awareness and the ability to communicate, which can affect mental health and, in extreme cases, cognitive function. Nevertheless, for complex reasons ranging from discomfort to social stigma, only 2M people choose to wear hearing aids.

The ambition of this project is to develop a cobot that augments the auditory experience of a hearing-impaired person. The cobot will move autonomously within the human partner's household to assist with everyday tasks. Our research will enable the cobot to exploit ego-motion in order to learn an internal representation of the acoustic scene (children chattering, kettle boiling, spouse calling for help). The cobot will interface with its partner through an on-person smart device (watch, mobile phone). Using this human-cobot interface, the cobot will alert its partner to salient events (call for help) via vibration alerts, and share its auditory experiences via interactive maps that visualise auditory cues and indicate saliency (e.g., loudness, spontaneity) and valence (positive versus concerning).

In contrast to smart devices, the cobot will have the unique capability to actively attend to and explore uncertain events (thump upstairs), and take action (assist spouse, call ambulance) without the need for permanently installed devices in personal spaces (bathroom, bedroom). Therefore, the project has the potential to transform the lives of people with hearing impairments by enabling long-term independent living, safeguarding privacy, and fostering inclusivity.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.soton.ac.uk