
Details of Grant 

EPSRC Reference: EP/S00453X/1
Title: MAN^3: huMAN-inspired robotic MANipulation for advanced MANufacturing
Principal Investigator: Jamone, Dr L
Other Investigators:
Researcher Co-Investigators:
Project Partners:
DeepMind, Ocado Group, The Shadow Robot Company
Department: Sch of Electronic Eng & Computer Science
Organisation: Queen Mary University of London
Scheme: New Investigator Award
Starts: 01 January 2019
Ends: 30 June 2022
Value (£): 310,597
EPSRC Research Topic Classifications:
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Manufacturing
Related Grants:
Panel History:
Panel Date | Panel Name | Outcome
13 Jun 2018 | Engineering Prioritisation Panel Meeting 13 and 14 June 2018 | Announced
Summary on Grant Application Form
Over the past 50 years, the use of robots in industry has increased steadily, and it has boomed in the last 10 years. In 2016, the average robot density (i.e. the number of robot units per 10,000 employees) in manufacturing industries worldwide was 74; by region, this was 99 units in Europe, 84 in the Americas and 63 in Asia, with average annual growth rates (between 2010 and 2016) of 9% in Asia, 7% in the Americas and 5% in Europe. From 2018 to 2020, global robot installations are estimated to increase by at least 15% per year on average.

The main market so far has been the automotive industry (an example of heavy manufacturing), where simple and repetitive robotic manipulation tasks are performed in very controlled settings by large, expensive robots, in dedicated areas of the factories where human workers are not allowed to enter for safety reasons. New growing markets for robots are consumer electronics and food/beverages (examples of light manufacturing) as well as other small and medium-sized enterprises (SMEs): in particular, the food and beverage industry increased robot orders by 12% each year between 2011 and 2015, and by 20% in 2016. However, in many cases the production processes of these industries require delicate handling and fine manipulation of many different items, posing serious challenges to the current capabilities of commercial robotic systems.

With 71 robot units per 10,000 employees (in 2016), the UK is the only G7 country with a robot density below the world average of 74, ranking 22nd. The industry and SME sector urgently needs modernisation to increase productivity and improve the working conditions (e.g. safety, engagement) of human workers: this requires the development and deployment of novel robotic technologies that can meet the needs of those businesses for which current robots are not yet effective.

One of the main reasons why robots are not effective in those applications is the lack of robot intelligence: the ability to learn and adapt that is typical of humans. Indeed, robotic manipulation can be enhanced by relying on humans, both through interaction (i.e. humans as direct teachers) and through inspiration (i.e. humans as models).

Therefore, the aim of this project is to develop a system for natural human demonstration of robotic manipulation tasks, combining immersive Virtual Reality technologies and smart wearable devices (to interface the human with the robot) with robot sensorimotor learning techniques and multimodal artificial perception (inspired by the human sensorimotor system). The robotic system will include a set of sensors that allow it to reconstruct the real world, in particular by integrating 3D vision with tactile information about contacts; the human user will access this artificial reconstruction through an immersive Virtual Reality interface that combines visual and haptic feedback. In other words, the user will see through the eyes of the robot and feel through the hands of the robot. Users will also be able to move the robot simply by moving their own limbs. This will allow human users to easily teach complex manipulation tasks to robots, and robots to learn efficient control strategies from the human demonstrations, so that they can then repeat the tasks autonomously.
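The demonstration-to-autonomy loop described above can be sketched in miniature. The following is an illustrative sketch only, not the project's actual software: it assumes demonstrations are recorded as (state, action) pairs while the user teleoperates the robot, and uses a simple nearest-neighbour lookup as a stand-in for the sensorimotor learning techniques the project will develop. All names (`record_demonstration`, `learn_policy`) are hypothetical.

```python
# Hypothetical sketch of learning from human demonstration: the robot
# replays, in a new state, the action demonstrated in the most similar
# recorded state. Real sensorimotor learning would be far richer.

def record_demonstration(trajectory):
    """A demonstration is a list of (state, action) pairs. In the
    envisioned system, a state could be fused 3D-vision and tactile
    features, and an action a commanded hand/arm motion."""
    return list(trajectory)

def learn_policy(demonstrations):
    """Pool all demonstrated pairs and return a nearest-neighbour
    policy over the recorded states."""
    dataset = [pair for demo in demonstrations for pair in demo]

    def policy(state):
        # Squared Euclidean distance between the query state and a
        # demonstrated state.
        def dist(pair):
            s, _ = pair
            return sum((a - b) ** 2 for a, b in zip(s, state))
        _, action = min(dataset, key=dist)
        return action

    return policy

# Usage: one demonstrated trajectory with 2-D states and scalar actions.
demo = record_demonstration([((0.0, 0.0), 0.1), ((1.0, 1.0), 0.9)])
policy = learn_policy([demo])
print(policy((0.9, 1.1)))  # nearest demonstrated state is (1.0, 1.0)
```

A nearest-neighbour policy is chosen here purely for brevity; it makes the "repeat what the human showed" idea concrete without committing to any particular learning algorithm.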

Human demonstration of simple robotic tasks has already found its way into industry (e.g. robotic painting, simple pick-and-place of rigid objects), but it still cannot be applied to the dexterous handling of generic objects (e.g. soft and delicate objects), which would greatly broaden its applicability (e.g. to food handling). Therefore, the expected results of this project will boost productivity in a large number of industrial processes (economic impact) and improve the working conditions and quality of life of human workers in terms of safety and engagement (social impact).
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary
Date Materialised
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: