
Details of Grant 

EPSRC Reference: EP/N013964/1
Title: GLANCE: GLAnceable Nuances for Contextual Events
Principal Investigator: Mayol-Cuevas, Professor WW
Other Investigators:
Ludwig, Dr C; Damen, Dr D; Gilchrist, Professor ID
Researcher Co-Investigators:
Project Partners:
easyJet Airline Company Limited
Department: Computer Science
Organisation: University of Bristol
Scheme: Standard Research
Starts: 04 April 2016 Ends: 02 April 2021 Value (£): 806,994
EPSRC Research Topic Classifications:
Computer Graphics & Visual.
Human-Computer Interactions
Image & Vision Computing
Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
Information Technologies
Related Grants:
Panel History:
Panel Date: 04 Sep 2015
Panel Name: User Interaction with ICT Panel - Full Proposals
Outcome: Announced
Summary on Grant Application Form
This project will develop and validate novel ways in which people can interact with the world via cognitive wearables - intelligent on-body computing systems that aim to understand the user and the context and, importantly, are prompt-less and useful. Specifically, we will focus on the automatic production and display of what we call glanceable guidance. Eschewing traditional, intricate 3D Augmented Reality approaches, whose usefulness has been difficult to demonstrate, glanceable guidance aims to synthesize the nuances of complex tasks into short snippets that are ideal for wearable computing systems, interfere less with the user, and are easier to learn and use.

There are two key research challenges. The first is to mine information from long, raw and unscripted wearable video of real user-object interactions in order to generate the glanceable supports. The second is to automatically detect the user's moments of uncertainty, during which support should be provided without an explicit prompt from the user.

The project aims to address the following fundamental problems:

1. Improve the detection of the user's attention by robustly determining the periods of time that correspond to task-relevant object interactions within a continuous stream of wearable visual and inertial sensor data.

2. Provide assistance only when it is needed by building models of the user, context and task from autonomously identified micro-interactions by multiple users, focusing on models that can facilitate guidance.

3. Identify and predict action uncertainty from wearable sensing, in particular gaze patterns and head motions.

4. Detect and weight user expertise to identify task nuances for the creation of optimally tailored, real-time guidance.

5. Design and deliver glanceable guidance that acts in a seamless and prompt-less manner during task performance with minimal interruptions, based on autonomously built models.
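As a purely illustrative sketch of problem 1 above (not the project's actual method), candidate interaction periods can be segmented from a one-dimensional inertial magnitude signal by thresholding a sliding-window mean; all names, the window size, and the threshold here are assumptions for illustration only.

```python
# Illustrative sketch only: segment candidate interaction periods
# from a 1-D inertial magnitude signal by thresholding the mean of
# a sliding window. Window size and threshold are arbitrary.

def interaction_periods(signal, window=5, threshold=1.0):
    """Return (start, end) index pairs of windows whose mean
    absolute value exceeds the threshold."""
    periods = []
    start = None
    for i in range(len(signal) - window + 1):
        mean = sum(abs(x) for x in signal[i:i + window]) / window
        if mean > threshold and start is None:
            start = i                    # activity begins
        elif mean <= threshold and start is not None:
            periods.append((start, i))   # activity ends
            start = None
    if start is not None:                # still active at end of stream
        periods.append((start, len(signal) - window + 1))
    return periods

# Example: a quiet stretch, a burst of motion, then quiet again.
sig = [0.1] * 10 + [2.0] * 10 + [0.1] * 10
print(interaction_periods(sig))  # one period covering the burst
```

A real system would of course fuse multiple sensor channels and learn the segmentation rather than hand-tune a threshold; this only shows the shape of the segmentation task.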

GLANCE is underpinned by a rich programme of experimental work and rigorous validation across a variety of interaction tasks and user groups. Populations to be tested include skilled users and the general population, on tasks that include assembly, using novel equipment (e.g. an unfamiliar coffee maker), and repair (e.g. replacing a bicycle gear cable). The project also tightly incorporates the development of working demonstrations.

In collaboration with our partners, the project will explore high-value impact cases in health care, towards assisted living, and in industrial settings, focusing on assembly and maintenance tasks.

Our team is a collaboration between Computer Science, to develop the novel data mining and computer vision algorithms, and Behavioural Science, to understand when and how users need support.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary
Date Materialised
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.bris.ac.uk