
Details of Grant 

EPSRC Reference: EP/M002632/1
Title: Interaction-based Human Motion Analysis
Principal Investigator: Shum, Dr H
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Cadence Cycling Performance Centre, Kinesio UK, Nine Health CIC
Department: Fac of Engineering and Environment
Organisation: Northumbria, University of
Scheme: First Grant - Revised 2009
Starts: 27 February 2015
Ends: 26 October 2016
Value (£): 99,055
EPSRC Research Topic Classifications:
Biomechanics & Rehabilitation; Computer Graphics & Visualisation
EPSRC Industrial Sector Classifications:
Healthcare; Creative Industries
Related Grants:
Panel History:
Panel Date     Panel Name                                   Outcome
09 Sep 2014    EPSRC ICT Prioritisation Panel - Sept 2014   Announced
Summary on Grant Application Form
In this project, we propose a new method to analyze human motion based on its interaction with the surrounding environment, which provides a better understanding of the nature of the performed motion and enhances the performance of modern motion-related applications.



Understanding human movement is a central problem for motion-related applications such as behaviour monitoring in smart homes and movement evaluation in physical therapy. Most traditional 3D motion analysis algorithms are human-centered, meaning that they consider only features of the human body and not the interaction with the surrounding environment. Imagine an older person standing on the floor doing arm-stretching exercises, and another standing on a chair to fix a light bulb, which is considered dangerous. The two motions have completely different contextual meanings, yet are surprisingly similar in terms of body posture. Traditional computer-based motion analysis methods disregard the relationship between the human and the environment, and thus cannot reliably tell the two motions apart.



We observe that real humans usually comprehend the context of a motion based on its interaction with the surrounding environment, such as sitting on a sofa, watching the television, riding a bicycle, etc. We believe that by considering these types of interactions, the quality of motion analysis can be improved as a result of a better understanding of the context of the motion. Therefore, we propose a new algorithm to analyze human motion based on the interaction between the human and the surrounding environment, which will enhance the accuracy of motion identification and the performance of movement evaluation. Our algorithm evaluates detailed 3D movement features with respect to the environment, such as analyzing the subtle movement of the feet of a Parkinson's disease patient with respect to the position of the stairs during a stair-climbing motion. It can therefore (1) analyze movement features from the interaction point of view, and (2) identify what kind of motion the user is performing based on the interaction context.
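The grant summary does not describe a concrete implementation, but a minimal sketch may help illustrate what interaction-based features could look like in practice. The Python fragment below (using NumPy) computes, for each body joint in a frame, the distance to the nearest point sampled from surrounding environment objects such as the floor, a chair seat or a stair edge. Every name and representation here (interaction_features, motion_descriptor, a 15-joint skeleton, a sampled floor grid) is a hypothetical assumption made for illustration, not the project's actual method.

import numpy as np

def interaction_features(joints, env_points):
    # joints:     (J, 3) array of 3D joint positions for one frame
    # env_points: (P, 3) array of 3D points sampled from environment
    #             objects (e.g. the floor, a chair seat, stair edges)
    # Returns a length-J vector holding, for each joint, the distance
    # to its nearest environment point: a simple human-environment
    # interaction descriptor for that frame.
    diff = joints[:, None, :] - env_points[None, :, :]   # (J, P, 3)
    dists = np.linalg.norm(diff, axis=-1)                # (J, P)
    return dists.min(axis=1)

def motion_descriptor(joint_sequence, env_points):
    # Stack per-frame interaction features over a whole motion clip.
    return np.stack([interaction_features(f, env_points) for f in joint_sequence])

# Toy usage: the same body posture yields very different features
# depending on whether the feet are near the floor or near a chair seat.
rng = np.random.default_rng(0)
clip = rng.normal(size=(60, 15, 3))                      # 60 frames, 15 joints
floor = np.array([[x, 0.0, z] for x in np.linspace(-1.0, 1.0, 10)
                              for z in np.linspace(-1.0, 1.0, 10)])
features = motion_descriptor(clip, floor)
print(features.shape)                                    # (60, 15)

Descriptors of this kind could then feed a classifier that distinguishes, for instance, "stretching on the floor" from "standing on a chair", even when the body postures themselves are nearly identical.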



The system proposed in this research could be used to enhance the quality of life of older people, enabling greater independence and reducing the burden on emergency and care services caused by a rising ageing population. Our algorithm accurately identifies human motion, which is an important step towards a smart home system that takes care of older people autonomously. It also aims to evaluate human movement, which can significantly reduce the labour cost of rehabilitation and coaching for older people, as well as support early-stage diagnosis of motion-related diseases such as Parkinson's disease.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: