
Details of Grant 

EPSRC Reference: EP/F023405/1
Title: GAIME: Gestural and Audio Interactions for Mobile Environments
Principal Investigator: Brewster, Professor SA
Other Investigators:
Murray-Smith, Professor R
Researcher Co-Investigators:
Project Partners:
Nokia
Department: School of Computing Science
Organisation: University of Glasgow
Scheme: Standard Research
Starts: 01 November 2007
Ends: 30 April 2011
Value (£): 367,099
EPSRC Research Topic Classifications:
Mobile Computing
EPSRC Industrial Sector Classifications:
Communications
Related Grants:
Panel History:
Panel Date      Panel Name                              Outcome
18 Oct 2007     ICT Prioritisation Panel (Technology)   Announced
Summary on Grant Application Form
Most PDAs and smart phones have sophisticated graphical interfaces and commonly use small keyboards or styli for input. The range of applications and services for such devices is growing all the time. However, there are problems which make interaction difficult when a user is on the move. Operating many of the applications demands a great deal of visual attention, which may not be available in mobile contexts. Oulasvirta et al. [29] showed that attention can become very fragmented for users on the move, as it must shift between navigating the environment and the device, making interaction hard. Our own research has shown that performance may drop by more than 20% when users are mobile [4]. Another important issue is that most devices require the hands to operate many of the applications, and the hands may not be free if the user is carrying bags, holding on to children or operating machinery, for example. The novel aspect of this proposal is to reduce the reliance on graphical displays and hands by investigating gesture input from other locations on the body, combined with three-dimensional sound for output.

Little work has gone into making input and control hands-free for mobile users. Speech recognition is still problematic in such settings due to its high processing requirements and the dynamic audio environments in which devices are used. Much of the research on gesture input still uses the hands for making the gestures. There is some work on head-based input, often for users with disabilities [26], but little of this has been used in mobile settings. Our own previous work has begun to examine head pointing and showed that it might be a useful way to point and select on the move [3].

Many other body locations could be useful for subtle and discreet input whilst mobile (e.g., while walking or sitting on a bumpy train). For example, wrist rotation has potential for controlling a radial menu, as the wrist can be rotated to move a pointer across the menu; it is unobtrusive and could be tracked using the same sensor used for hand pointing gestures (in a watch, for example). Small changes in gait are also a possibility for interaction: in previous work [12] we extracted gait information from an accelerometer on a PDA to look at usability errors, and we can adapt this technique so that users could slightly change the timing of a step to make input. There has been no systematic study of the different input possibilities across the body. We will develop a novel testing methodology using a Fitts' law analysis along with more subjective measures to find out which body locations are most useful for input on the move.

Output is also a problem due to the load on visual attention when users are mobile. We and others have begun to look at the use of spatialised audio cues for output when mobile, as an alternative or complement to graphics [1, 6, 19, 32]. Many of these use very simple 3D audio displays but, with careful design, spatial audio could provide a much richer display space. Our AudioClouds project built some foundations for 3D audio interactions, investigating basic pointing behaviour, target size and separation [1, 3]. We now need to take this work forward and develop more sophisticated interactions.
Key aspects here are to develop the use of egocentric (fixed to the user) and exocentric (fixed to the world) displays, and how they can be combined to create a rich 3D display space for interaction.

The final key part of this project is to create compelling applications which combine the best of the audio and gestures. We can then test these with users in more realistic settings over longer time periods to fine-tune how these interactions work in the real world.
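To make the wrist-rotation idea above concrete, the following minimal sketch (purely illustrative, not taken from the proposal; the 8-item menu, the 120-degree comfortable rotation range and the hypothetical wrist-worn sensor reporting a roll angle are all assumptions) maps a wrist roll angle onto an item of a radial menu:

def radial_menu_item(roll_deg: float, n_items: int = 8,
                     usable_range_deg: float = 120.0) -> int:
    """Map a wrist roll angle (degrees, 0 = neutral posture) to a menu item index.

    Assumes a comfortable rotation range of +/- usable_range_deg/2 around the
    neutral wrist posture, divided evenly into n_items sectors.
    """
    half = usable_range_deg / 2.0
    # Clamp to the comfortable range so extreme rotations still select the
    # first or last item rather than wrapping around.
    clamped = max(-half, min(half, roll_deg))
    # Shift to [0, usable_range_deg) and quantise into sectors.
    sector = int((clamped + half) / usable_range_deg * n_items)
    return min(sector, n_items - 1)

# Example: a 30-degree pronation with an 8-item menu selects item 6.
print(radial_menu_item(30.0))

The same quantisation idea would apply to other subtle body movements, such as small shifts in step timing extracted from an accelerometer.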
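The Fitts' law methodology mentioned above could, for instance, score each body location by the throughput of its pointing movements. A minimal sketch of the widely used Shannon formulation follows (an illustration only; the proposal does not specify the exact analysis):

import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Index of difficulty in bits for a target of a given width at a given
    distance: ID = log2(distance / width + 1)."""
    return math.log2(distance / width + 1.0)

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    """Throughput (bits per second) for one pointing movement: ID / MT.
    Averaged over many trials, this puts body locations (head, wrist, ...)
    on a common scale for comparison."""
    return index_of_difficulty(distance, width) / movement_time_s

# Example: a 30-degree movement to a 10-degree target completed in 1.2 s
# gives an ID of 2 bits and a throughput of about 1.67 bits/s.
print(index_of_difficulty(30, 10), throughput(30, 10, 1.2))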
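For the egocentric and exocentric audio displays, a spatial audio renderer ultimately needs every source expressed relative to the listener's head. The sketch below (illustrative only; it assumes 2D positions and a yaw-only head tracker) converts a world-fixed, exocentric source into an egocentric azimuth and distance, whereas an egocentric cue is already specified relative to the head and needs no such conversion:

import math

def exocentric_to_egocentric(source_xy, listener_xy, head_yaw_deg):
    """Return (azimuth_deg, distance) of a world-fixed source relative to the
    listener's head. Azimuth 0 = straight ahead, positive = to the right.
    Only yaw is handled here; pitch and roll are ignored for brevity."""
    dx = source_xy[0] - listener_xy[0]
    dy = source_xy[1] - listener_xy[1]
    distance = math.hypot(dx, dy)
    # Bearing of the source in world coordinates, then subtract the head yaw
    # and normalise to [-180, 180).
    bearing = math.degrees(math.atan2(dx, dy))
    azimuth = (bearing - head_yaw_deg + 180.0) % 360.0 - 180.0
    return azimuth, distance

# Example: a listener at the origin facing 45 degrees hears a source at
# (10, 10) straight ahead, about 14.1 units away.
print(exocentric_to_egocentric((10.0, 10.0), (0.0, 0.0), 45.0))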
Key Findings, Potential Use in Non-Academic Contexts and Impacts
This information can now be found on Gateway to Research (GtR): http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.gla.ac.uk