
Details of Grant 

EPSRC Reference: EP/R03298X/1
Title: ExTOL: End to End Translation of British Sign Language
Principal Investigator: Bowden, Professor R
Other Investigators:
Woll, Professor B
Cormier, Professor KA
Zisserman, Professor A
Researcher Co-Investigators:
Project Partners:
BBC
Catholic (Radboud) University Foundation
European Union of the Deaf
Inter College of Therapeutic Education
University of Hamburg
Department: Vision, Speech and Signal Processing (CVSSP)
Organisation: University of Surrey
Scheme: Standard Research
Starts: 01 July 2018
Ends: 30 June 2022
Value (£): 971,921
EPSRC Research Topic Classifications:
Artificial Intelligence
Computational Linguistics
Image & Vision Computing
EPSRC Industrial Sector Classifications:
Creative Industries
Education
Related Grants:
Panel History:
Panel Date: 06 Feb 2018
Panel Name: ICT Cross-Disciplinarity and Co-Creation
Outcome: Announced
Summary on Grant Application Form
British Sign Language (BSL) is the natural language of the British Deaf community and is as rich and expressive as any spoken language. However, BSL is not simply English words converted into hand motions: it is a language in its own right, with its own grammar, very different from English. BSL also uses different elements of the body simultaneously; not just the movement and shape of the hands, but also the body, face, mouth and the space around the signer are used to convey meaning.

The linguistic study of sign languages is quite new compared to that of spoken languages, having begun only in the 1960s. Linguists are very interested in sign languages because of what they can reveal about the possibilities of human language that do not rely on sound at all. One of the problems is that studying sign languages involves analysing video footage, and because sign languages lack any standard writing or transcription system, this is extremely labour-intensive. This project will develop computer vision tools to assist with video analysis. These tools will in turn help linguists increase their knowledge of the language, with the long-term ambition of creating the world's first machine-readable dataset of a sign language, a goal achieved for large bodies of spoken-language text in the 1970s.
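Purely as an illustration (not part of the grant application), the sketch below shows one plausible form such a video-analysis aid could take: extracting body, hand and face keypoints from sign language footage so that a linguist's annotation effort can be focused on frames where signing is likely. The choice of the MediaPipe library, the file name bsl_clip.mp4 and the simple "hands visible" heuristic are all assumptions made for the example, not the project's actual tools.

# Hedged sketch of a keypoint-extraction aid for sign language video annotation.
# Assumes the third-party mediapipe and opencv-python packages; the video file
# name and the hand-visibility heuristic are illustrative placeholders.
import cv2
import mediapipe as mp

def extract_keypoints(video_path):
    """Yield per-frame pose, hand and face landmarks for later annotation."""
    holistic = mp.solutions.holistic.Holistic(static_image_mode=False)
    cap = cv2.VideoCapture(video_path)
    frame_idx = 0
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV decodes frames as BGR.
        results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        yield (frame_idx, results.pose_landmarks,
               results.left_hand_landmarks, results.right_hand_landmarks,
               results.face_landmarks)
        frame_idx += 1
    cap.release()
    holistic.close()

for idx, pose, lh, rh, face in extract_keypoints("bsl_clip.mp4"):
    # Frames where a hand is detected could be flagged for a linguist to label.
    if lh is not None or rh is not None:
        print(f"frame {idx}: hands visible, candidate signing segment")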

The ultimate goal of this project is to take the annotated data and understanding from linguistic study and use them to build a system capable of watching a human signing and turning this into written English. This will be a world first and an important landmark for deaf-hearing communication. To achieve this, the computer must be able to recognise not only hand motion and shape but also the facial expression and body posture of the signer. It must also understand how these aspects are put together into phrases and how those phrases can be translated into written/spoken language.
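Again as an illustration only (none of this comes from the grant text), the sketch below shows one plausible shape for such a system: a sequence-to-sequence model that consumes per-frame keypoints for the hands, face and body and emits English words. The use of PyTorch and a Transformer, the feature dimension, vocabulary size and layer sizes are all placeholder assumptions rather than the project's published design.

# Minimal sketch of a sign-video-to-English sequence-to-sequence model.
# All dimensions and the architecture choice are illustrative assumptions.
import torch
import torch.nn as nn

class SignToText(nn.Module):
    def __init__(self, feat_dim=150, vocab_size=10000, d_model=256):
        super().__init__()
        self.embed_video = nn.Linear(feat_dim, d_model)    # per-frame keypoints -> model space
        self.embed_text = nn.Embedding(vocab_size, d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=4,
            num_encoder_layers=3, num_decoder_layers=3,
            batch_first=True,
        )
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, frames, tokens):
        # frames: (batch, n_frames, feat_dim) of hand/face/body keypoints
        # tokens: (batch, n_tokens) of target English word indices
        src = self.embed_video(frames)
        tgt = self.embed_text(tokens)
        mask = self.transformer.generate_square_subsequent_mask(tokens.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=mask)
        return self.out(hidden)                            # next-word logits

model = SignToText()
frames = torch.randn(2, 120, 150)          # two clips, 120 frames of keypoints each
tokens = torch.randint(0, 10000, (2, 12))  # partial English translations
logits = model(frames, tokens)
print(logits.shape)                         # torch.Size([2, 12, 10000])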

Although there have been some recent advances in sign language recognition via data gloves and motion capture systems like Kinect, part of the problem is that most computer scientists in this research area do not have the required in-depth knowledge of sign language. This project is therefore a strategic collaboration between leading experts in British Sign Language linguistics and software engineers who specialise in computer vision and machine learning, with the aim of building the world's first British Sign Language to English Translation system and the first practically functional machine translation system for any sign language.
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.surrey.ac.uk