
Details of Grant 

EPSRC Reference: EP/N014278/1
Title: ACE-LP: Augmenting Communication using Environmental Data to drive Language Prediction.
Principal Investigator: Waller, Professor A
Other Investigators:
Zhang, Dr J
McKenna, Professor SJ
Kristensson, Professor P
Researcher Co-Investigators:
Mr R Black
Project Partners:
Arria NLG Ltd (UK)
Capability Scotland
Communication Matters
Edesix Ltd
National Museums of Scotland
Ninewells Hospital & Medical School
Scope
Sensory Software International Ltd
Tobii Dynavox
Department: Computing
Organisation: University of Dundee
Scheme: Standard Research
Starts: 01 February 2016
Ends: 31 January 2020
Value (£): 1,007,562
EPSRC Research Topic Classifications:
Artificial Intelligence
Comput./Corpus Linguistics
Human-Computer Interactions
EPSRC Industrial Sector Classifications:
Healthcare
Related Grants:
Panel History:
Panel Date: 04 Sep 2015
Panel Name: User Interaction with ICT Panel - Full Proposals
Outcome: Announced
Summary on Grant Application Form
Communication is the essence of life. We communicate in many ways, but it is our ability to speak which enables us to chat in everyday situations. An estimated quarter of a million people in the UK alone are unable to speak and are at risk of isolation. They depend on Voice Output Communication Aids (VOCAs) to compensate for their disability. However, current state-of-the-art VOCAs are only able to produce computerised speech at an insufficient rate of 8 to 10 words per minute (wpm). For some users who are unable to use a keyboard, rates are even slower. For example, Professor Stephen Hawking recently doubled his spoken communication rate to 2 wpm by incorporating a more efficient word prediction system and common shortcuts into his VOCA software. Despite three decades of VOCA development, face-to-face communication rates remain prohibitively slow. Users seldom go beyond basic needs-based utterances, as rates remain, at best, 10 times slower than natural speech. Compared with the average of 150-190 wpm for typical speech, aided communication rates make conversation almost impossible.

ACE-LP brings together research expertise in Augmentative and Alternative Communication (AAC) (University of Dundee), Intelligent Interactive Systems (University of Cambridge), and Computer Vision and Image Processing (University of Dundee) to develop a predictive AAC system that addresses these prohibitively slow communication rates by using multimodal sensor data to inform state-of-the-art language prediction. For the first time, a VOCA system will not only predict words and phrases: we aim to provide access to extended conversation by predicting narrative text elements tailored to an ongoing conversation.

In current systems, users sometimes pre-store monologue 'talks', but sharing personal experiences (stories) interactively using VOCAs is rare. Being able to relate experience enables us to engage with others and to participate in society; in fact, the bulk of our interaction with others is through the medium of conversational narrative, i.e. sharing personal stories. Several research projects have prototyped ways in which automatically gathered data and language processing can help disabled users communicate more easily and at higher rates. However, none has succeeded in harnessing this technology in an integrated communication system that automatically extracts meaningful data from different sources, transforms it into conversational text elements, and presents the results so that people with severe physical disabilities can manipulate and select conversational items for output through a speech synthesiser quickly and with minimal physical and cognitive effort.

This project will develop technology that leverages contextual data (e.g. information about location, conversational partners and past conversations) to support language prediction within an onscreen user interface that adapts to the conversational topic, the conversational partner, the setting and the physical ability of the nonspeaking person. Our aim is to improve the communication experience of nonspeaking people by enabling them to tell their stories easily and at more acceptable speeds.
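To make the idea of context-driven prediction concrete, the sketch below shows one possible way stored narrative elements could be re-ranked using contextual signals such as location, conversational partner and topic. It is a minimal illustration only: the class names, tags and scoring scheme are assumptions for this example and do not describe the ACE-LP implementation, which combines sensor data with statistical language prediction.

# Illustrative sketch only: names, tags and scoring are hypothetical, not the
# ACE-LP design. It shows the general idea of re-ranking stored conversational
# items using contextual signals gathered from sensors and conversation history.
from dataclasses import dataclass, field


@dataclass
class Context:
    """Contextual signals for the current conversation."""
    location: str   # e.g. "museum"
    partner: str    # e.g. "carer"
    topic: str      # e.g. "holiday"


@dataclass
class StoryItem:
    """A stored narrative element with tags describing when it is relevant."""
    text: str
    tags: dict = field(default_factory=dict)   # e.g. {"location": "museum"}


def rank_items(items: list[StoryItem], ctx: Context) -> list[StoryItem]:
    """Order candidate items so the most contextually relevant appear first."""
    def score(item: StoryItem) -> int:
        # Count matching contextual attributes; a real system would combine a
        # language model with learned weights for each signal.
        return sum(
            1 for key in ("location", "partner", "topic")
            if item.tags.get(key) == getattr(ctx, key)
        )
    return sorted(items, key=score, reverse=True)


if __name__ == "__main__":
    items = [
        StoryItem("Last week we saw the Viking exhibit.", {"location": "museum"}),
        StoryItem("I'd like a cup of tea, please.", {"partner": "carer"}),
    ]
    ctx = Context(location="museum", partner="friend", topic="history")
    for item in rank_items(items, ctx):
        print(item.text)

In a full system, the ranked items would populate the adaptive onscreen interface, so the user can select and speak a relevant story with minimal effort.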
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.dundee.ac.uk