
Details of Grant 

EPSRC Reference: EP/T00603X/1
Title: RoboPatient - Robot assisted learning of constrained haptic information gain
Principal Investigator: Nanayakkara, Professor T
Other Investigators:
Van Zalk, Dr N; Ghajari, Dr M
Researcher Co-Investigators:
Project Partners:
Department: Design Engineering (Dyson School)
Organisation: Imperial College London
Scheme: Standard Research
Starts: 01 December 2019
Ends: 31 July 2024
Value (£): 1,076,801
EPSRC Research Topic Classifications:
Image & Vision Computing
Med. Instrument. Device & Equip.
Robotics & Autonomy
Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
EP/T004509/1; EP/T00519X/1
Panel History:
Panel Date: 09 Jul 2019
Panel Name: HT Investigator-led Panel Meeting - July 2019
Outcome: Announced
Summary on Grant Application Form
Primary examination of a patient by a physician often involves physical examination to estimate the condition of internal organs in the abdomen. It is tacit knowledge among physicians that the skill to combine haptic cues with visual reaction feedback during physical examination of a patient is key to establishing trust and to improving the accuracy of preliminary diagnosis. Clinical sessions in medical training involve either real patients or static rubber mannequins. Both methods are opaque to some key information, such as how variation of the physician's finger stiffness affects stress levels at organ boundaries. Moreover, feedback given in ward demonstrations is often qualitative. The ambition of this project is to investigate how a functional robotic patient can establish an efficient link between an expert demonstrator and a trainee physician to develop palpation skills that the two human counterparts cannot establish using a real patient.



In this project, we introduce the first generation of a functional robotic patient with sensorised and controllable internal organs, together with a detailed finite element-based abdominal tissue model to visualise enhanced sensor data. The human-robot interaction will be enriched using augmented reality-based facial reactions during physical examination of the robotic patient. This will help trainee physicians to derive deeper insights from a tutor's demonstrations. A robotic patient would also allow us to quantify and reinforce established techniques - for example, starting from distant regions to "calibrate" the peripheral sensory system - while learning what constitutes an optimal approach, such as pressing just hard enough at the right frequency. The experiential feedback loop will, therefore, allow us to quantify the improvement.

The main scientific outcome of this project will be a deeper understanding of how an expert physician could use a robot-assisted approach to allow a trainee physician to efficiently acquire the skill of manual palpation subject to visually perceived constraints. First trials will be done at the University of Surrey, with a subsequent introduction to other General Practitioner trainers and trainees through Royal College of General Practitioners workshops. Student and tutor feedback from pilot trials will be used to improve the robo-patient design in a user-centred co-design framework.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.imperial.ac.uk