
Details of Grant 

EPSRC Reference: EP/D003180/1
Title: Empathic avatars: input, processing and output of emotional state
Principal Investigator: Dodgson, Professor NA
Other Investigators:
Robinson, Professor P
Researcher Co-Investigators:
Project Partners:
BT
Hi8us
Department: Computer Science and Technology
Organisation: University of Cambridge
Scheme: Standard Research (Pre-FEC)
Starts: 01 October 2005
Ends: 30 September 2009
Value (£): 264,965
EPSRC Research Topic Classifications:
Human-Computer Interactions
EPSRC Industrial Sector Classifications:
Creative Industries
Related Grants:
EP/D505542/1
Panel History:  
Summary on Grant Application Form
Emotion is important in human interaction. From a range of subtle physical and aural cues we make judgements about the emotional state of others and modify our behaviour accordingly. Such cues are currently missing from automated computer animation and are not collected by any commercial computer system.

We propose research into the automatic identification and tracking of a person's emotional state, using a variety of inputs, and the automatic addition of non-verbal emotional cues to human-figure animation. These two aspects, input and output, are tied together in applications where a participant in a shared virtual reality is represented by an animated character or avatar. Examples include remote collaboration, virtual acting (for example, actors rehearsing in separate cities), on-line shopping, networked games and chat rooms (where three-dimensional avatars represent participants). Examples of the two aspects being used separately are customer relations (emotional input providing information to an automated service system) and personal computing (on the input side, a computer reacting to a person's emotional state and, on the output side, the computer displaying an animated avatar which can exhibit non-verbal emotional cues).

The project has four strands:
(1) We will infer emotional state from a number of variables that can be obtained in real time from participants, such as their facial expression, voice, heart rate, skin conductance and respiration. This information will be fused based on a computational emotional model. Part of this strand will be the development of an appropriate emotional model based on existing psychological research.
(2) We will undertake research into the extraction of emotional cues from body language, a hitherto little-explored area which has great relevance to the acting-rehearsal application in strand (4). We will use Cambridge's Active Bat system as a lightweight input mechanism.
(3) The emotional model from strand (1) will be used to drive aspects of an animated character or avatar, in particular allowing the inferred emotional state to influence how avatar animations are rendered, taking into account both the real tracked movements of the person and the additional emotional information.
(4) The whole system will then be tested with an acting-rehearsal application, building on UCL's previous work in this area. Acting rehearsal is probably the most demanding application in terms of accurate emotional feedback and thus provides a challenging environment in which to test the results of the other three strands.
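The summary does not specify how strand (1) would fuse the sensor channels into a single emotional state. A minimal illustrative sketch, assuming a confidence-weighted average in a two-dimensional valence/arousal space (one common representation in the psychological literature; the channel names follow the summary, but the weights, thresholds and labels are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class ChannelEstimate:
    """A hypothetical per-channel reading, already normalised upstream."""
    valence: float     # -1 = displeasure, +1 = pleasure
    arousal: float     # 0 = calm, 1 = highly activated
    confidence: float  # how much to trust this channel right now

def fuse(channels: dict[str, ChannelEstimate]) -> tuple[float, float]:
    """Confidence-weighted fusion of multimodal emotion estimates."""
    total = sum(c.confidence for c in channels.values())
    if total == 0:
        return 0.0, 0.0  # no usable signal: report a neutral state
    valence = sum(c.valence * c.confidence for c in channels.values()) / total
    arousal = sum(c.arousal * c.confidence for c in channels.values()) / total
    return valence, arousal

def label(valence: float, arousal: float) -> str:
    """Map the fused state to a coarse category for avatar rendering."""
    if arousal < 0.3:
        return "calm"
    return "excited" if valence >= 0 else "distressed"

# Example: three of the input channels named in strand (1).
estimate = fuse({
    "face": ChannelEstimate(valence=0.6, arousal=0.7, confidence=0.9),
    "voice": ChannelEstimate(valence=0.4, arousal=0.8, confidence=0.5),
    "heart_rate": ChannelEstimate(valence=0.0, arousal=0.9, confidence=0.3),
})
```

The per-channel confidence lets a noisy or momentarily unavailable sensor (say, an occluded face) fade out of the estimate rather than corrupt it, which is the practical point of fusing several modalities.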
Key Findings, Potential use in non-academic contexts, Impacts, and Sectors submitted by the Researcher:
This information can now be found on Gateway to Research (GtR): http://gtr.rcuk.ac.uk
Project URL: http://www.cl.cam.ac.uk/research/rainbow/emotions/
Further Information:  
Organisation Website: http://www.cam.ac.uk