
Details of Grant 

EPSRC Reference: EP/M019284/1
Title: An Integrated Vision and Control Architecture for Agile Robotic Exploration
Principal Investigator: Dudek, Professor P
Other Investigators:
Researcher Co-Investigators:
Dr S J Carey
Project Partners:
BAE Systems
Blue Bear Systems Research Ltd
Defence Science & Tech Lab (DSTL)
TRW
Department: Electrical and Electronic Engineering
Organisation: The University of Manchester
Scheme: Standard Research
Starts: 01 September 2015
Ends: 31 January 2021
Value (£): 858,324
EPSRC Research Topic Classifications:
Artificial Intelligence
Control Engineering
Image & Vision Computing
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Aerospace, Defence and Marine
Electronics
Related Grants:
EP/M019454/1
Panel History:
Panel Date: 25 Feb 2015
Panel Name: Engineering Prioritisation Panel Meeting 25 February 2015
Outcome: Announced
Summary on Grant Application Form
Autonomous robots, capable of independent and intelligent navigation through unknown environments, have the potential to significantly increase human safety and security. They could replace people in potentially hazardous tasks, for instance search and rescue operations in disaster zones or surveys of nuclear and chemical installations. Vision is one of the primary senses that can enable this capability; however, visual information processing is notoriously difficult, especially at the speeds required for fast-moving robots, and in particular where low weight, power dissipation and cost are of concern. Conventional hardware and algorithms are not up to the task. The proposal here is to tightly integrate novel sensing and processing hardware with vision, navigation and control algorithms, enabling the next generation of autonomous robots.

At the heart of the system will be a device known as a 'vision chip'. This bespoke integrated circuit differs from a conventional image sensor in that it includes a processor within each pixel, offering unprecedented performance. The massively parallel processor array will be programmed to pre-process images, passing higher-level feature information onward to vision tracking algorithms and the control system. Feature extraction at the pixel level yields extremely efficient, high-speed information throughput. The new vision chip will also measure 'time of flight' data in each pixel, allowing the distance to a feature to be extracted and combined with the image-plane data for vision tracking, simplifying and speeding up real-time state estimation and mapping. Vision algorithms will be developed to make optimal use of this novel hardware technology.
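
As a rough illustration only (this is not the project's chip firmware; the function pixel_parallel_features and the arrays frame and depth are hypothetical), the following Python/NumPy sketch mimics the kind of computation a processor-per-pixel array performs: every pixel derives a local edge response in parallel, and strong responses are paired with the time-of-flight range measured at the same pixel, so only a compact list of (x, y, range) features leaves the sensor.

    import numpy as np

    def pixel_parallel_features(frame, depth, threshold=0.2):
        """Toy model of per-pixel processing on a vision chip.

        Every pixel computes a local edge response from its neighbours
        (the kind of operation a processor-per-pixel array performs in
        parallel), then pairs strong responses with the time-of-flight
        range measured at the same pixel.
        """
        # Neighbour differences computed for all pixels at once, mimicking
        # the SIMD behaviour of a processor-per-pixel array.
        gx = np.abs(np.roll(frame, -1, axis=1) - np.roll(frame, 1, axis=1))
        gy = np.abs(np.roll(frame, -1, axis=0) - np.roll(frame, 1, axis=0))
        response = gx + gy

        # Only pixels above threshold report onward -- this data reduction
        # is what makes sensor-side pre-processing so efficient.
        ys, xs = np.nonzero(response > threshold)
        return np.stack([xs, ys, depth[ys, xs]], axis=1)  # (x, y, range) rows

    # Example: a 64x64 synthetic frame with a bright square at 2.5 m range.
    frame = np.zeros((64, 64)); frame[20:40, 20:40] = 1.0
    depth = np.full((64, 64), 2.5)
    features = pixel_parallel_features(frame, depth)
    print(f"{len(features)} feature pixels, e.g. {features[0]}")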

This project will not only develop a unique vision processing system, but will also tightly integrate the control system design. Vision and control systems have traditionally been developed independently, with a downstream flow of information from sensor through to motor control. In our system, information flow will be bidirectional: control system parameters will be passed to the image sensor itself, guiding computational effort and reducing processing overheads. For example, a rotational demand passed into the control system will not only produce control actuation for vehicle movement, but will also drive optic tracking along the same path, as sketched below. A key component of the project will therefore be the management and control of information across all three layers: sensing, visual perception and control. Information sharing will occur at multiple rates and may be either scheduled or requested. Shared information and distributed computation will provide a breakthrough in control capabilities for highly agile robotic systems.
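
To make the bidirectional flow concrete, here is a minimal hypothetical sketch in Python (predict_shift, track_feature and all parameter values are illustrative assumptions, not the project's interface): a commanded yaw rate is fed forward to the tracker, which shifts and narrows its per-feature search window along the expected optic-flow direction instead of searching the whole image row.

    def predict_shift(yaw_rate, dt, focal_px):
        """Predict the horizontal image shift (pixels) caused by a commanded
        yaw, using a small-angle pin-hole approximation: focal_px * rate * dt."""
        return int(round(focal_px * yaw_rate * dt))

    def track_feature(search_centre, yaw_rate, dt=0.01, focal_px=300.0, window=8):
        """Return a narrowed search window biased by the control demand.

        The tracker searches only `window` pixels either side of where the
        commanded rotation predicts the feature will be -- the
        control-to-sensor feedback path described above."""
        predicted = search_centre + predict_shift(yaw_rate, dt, focal_px)
        return range(predicted - window, predicted + window + 1)

    # A 1 rad/s yaw demand over 10 ms shifts features ~3 px, so the tracker
    # inspects 17 pixels rather than a full image row.
    print(list(track_feature(search_centre=120, yaw_rate=1.0)))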

Whilst applicable to a very wide range of disciplines, our system will be tested in the demanding field of autonomous aerial robotics. We will integrate the new vision sensors on board an unmanned air vehicle (UAV), developing a control system that fully exploits the new tracking capabilities. This will serve as a demonstration platform for the complete vision system, incorporating nonlinear algorithms to control the vehicle through agile manoeuvres and rapidly changing trajectories. Although specific vision tracking and control algorithms will be used for the project, the hardware itself and the system architecture will be applicable to a much broader set of tasks: any application that is currently limited by tracking capabilities, in particular when combined with a rapid, demanding control challenge, would benefit from this work. We will demonstrate a step change in agile, vision-based control of UAVs for exploration, and in doing so develop an architecture with benefits in fields as diverse as medical robotics and industrial production.
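
As a final hedged sketch (a generic feedback-linearising rate controller with assumed inertia and drag values, not the project's control law), the snippet below shows the flavour of nonlinear control needed for agile manoeuvres: the quadratic aerodynamic drag term is cancelled so the commanded yaw rate is tracked with simple first-order error decay.

    I, C = 0.02, 0.005  # assumed inertia (kg m^2) and quadratic drag coefficient

    def fl_rate_controller(omega, omega_ref, k=8.0):
        """Feedback-linearising single-axis rate controller (illustrative).

        Cancels the quadratic drag term, then imposes first-order decay on
        the rate error: I * omega_dot = k * I * (omega_ref - omega)."""
        return C * omega * abs(omega) + I * k * (omega_ref - omega)

    # Simulate 0.5 s of yaw-rate tracking toward a 2 rad/s demand.
    omega, dt = 0.0, 0.002
    for _ in range(250):
        torque = fl_rate_controller(omega, omega_ref=2.0)
        omega += (torque - C * omega * abs(omega)) / I * dt  # I*omega_dot = torque - drag
    print(f"yaw rate after 0.5 s: {omega:.3f} rad/s")  # converges towards 2.0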

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.man.ac.uk