
Details of Grant 

EPSRC Reference: EP/R005605/1
Title: iSee - Intelligent Vision for Grasping
Principal Investigator: Siebert, Dr J
Other Investigators:
Williamson, Dr J; Aragon-Camarasa, Dr G
Researcher Co-Investigators:
Project Partners:
Department: School of Computing Science
Organisation: University of Glasgow
Scheme: Technology Programme
Starts: 01 January 2017
Ends: 31 March 2018
Value (£): 149,217
EPSRC Research Topic Classifications:
Image & Vision Computing; Robotics & Autonomy
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
Panel History:  
Summary on Grant Application Form
Intelligent vision is a key enabler for future robotics technology. Shadow's recent development work on a disruptive universal gripper (launched at Innovate16) has identified a need for better vision to permit the automation of grasping. Building on more than 20 years of R&D at the University of Glasgow, the iSee project will establish concrete robot vision benchmarks, based on commercially relevant scenes, and then develop, validate and integrate vision sensors and processing algorithms into the Smart Grasping System (SGS) to enable it to reach significant new markets in automation, logistics and service robotics. The following candidate sensors have been selected for benchmarking:

A. Low-cost Time-of-Flight (TOF) 3D cameras (available from various companies)

B. Stereo pairs of off-the-shelf HD cameras and stereo pairs of embedded vision sensors, in conjunction with CVAS's existing custom stereo-pair image matching and photogrammetry software (a classical version of this pipeline is sketched after this list)

C. An Asus Xtion RGBD camera will serve as a benchmark reference sensor
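
For orientation, the sketch below shows the classical stereo pipeline that option B relies on: block matching over a rectified image pair, then the standard depth-from-disparity relation Z = fB/d. It uses OpenCV as a stand-in for CVAS's proprietary software, and the focal length, baseline and file names are illustrative placeholders, not the project's rig parameters.

    # Minimal classical stereo sketch (OpenCV stand-in, placeholder parameters).
    import cv2
    import numpy as np

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # rectified left image
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)  # rectified right image

    # Block matcher: numDisparities must be a multiple of 16.
    matcher = cv2.StereoBM_create(numDisparities=128, blockSize=15)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # fixed-point to pixels

    f_px = 1400.0    # focal length in pixels (assumed)
    baseline = 0.06  # stereo baseline in metres (assumed)

    # Depth from disparity: Z = f * B / d, valid only where d > 0.
    valid = disparity > 0
    depth = np.zeros_like(disparity)
    depth[valid] = f_px * baseline / disparity[valid]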

We propose to build an integrated hand-eye system for each sensor listed above, along with appropriate lighting, and to develop complete integrated pipelines to benchmark the different combinations of capture and analysis systems on the specified scenarios. This investigation will allow us to characterise 2D and 3D sensing methods in terms of image-capture quality and hand-eye performance.
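
Whichever sensor is chosen, an integrated hand-eye rig needs the rigid transform between the camera and the gripper before image measurements can drive grasping. The sketch below illustrates eye-in-hand calibration with OpenCV's calibrateHandEye; the poses are synthetic stand-ins for robot forward kinematics and calibration-target detections, since the summary does not describe the project's own calibration procedure.

    # Eye-in-hand calibration sketch with synthetic poses (assumed setup).
    import cv2
    import numpy as np

    rng = np.random.default_rng(0)

    def random_pose():
        """Random rigid transform: rotation via Rodrigues, small translation."""
        R, _ = cv2.Rodrigues(rng.uniform(-0.5, 0.5, (3, 1)))
        T = np.eye(4)
        T[:3, :3] = R
        T[:3, 3] = rng.uniform(-0.3, 0.3, 3)
        return T

    X_true = random_pose()         # ground-truth camera-to-gripper transform (unknown in practice)
    T_target2base = random_pose()  # calibration board fixed in the robot base frame

    R_g2b, t_g2b, R_t2c, t_t2c = [], [], [], []
    for _ in range(10):
        T_g2b = random_pose()                         # gripper pose (forward kinematics)
        T_c2b = T_g2b @ X_true                        # implied camera pose in base frame
        T_t2c = np.linalg.inv(T_c2b) @ T_target2base  # target pose as the camera sees it
        R_g2b.append(T_g2b[:3, :3]); t_g2b.append(T_g2b[:3, 3])
        R_t2c.append(T_t2c[:3, :3]); t_t2c.append(T_t2c[:3, 3])

    # Solve AX = XB for the camera-to-gripper transform.
    R_est, t_est = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c)
    print(np.allclose(R_est, X_true[:3, :3], atol=1e-6))  # True: rotation recovered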

We propose also to evaluate the new and highly disruptive deep convolutional neural network (DCNN) technology, which has the potential to leapfrog the best algorithmic vision methods and to provide a fast, accurate and complete vision solution that meets the demands of advanced robotic grasping and manipulation. We will thus augment the evaluation with potentially very efficient and high-speed DCNN algorithms for interpreting images using potentially low-cost sensors for:

* Detecting and localising known objects and estimating their pose for grasping purposes

* Estimating depth, size and surface normals directly from single monocular images, using transfer learning methods (a hedged example is sketched after this list)

* Recovering depth from binocular and monocular camera systems using stereo matching and structure from motion (optical flow) respectively
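
As a concrete illustration of the monocular bullet above: a pretrained depth-estimation DCNN maps a single RGB image to a dense inverse-depth map in one forward pass. The snippet uses the publicly released MiDaS model from torch.hub purely as a stand-in; that model postdates this project, and the networks trained within iSee are not public.

    # Single-image depth with a pretrained DCNN (MiDaS used as a stand-in).
    import cv2
    import torch

    model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    model.eval()
    transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform

    img = cv2.cvtColor(cv2.imread("scene.jpg"), cv2.COLOR_BGR2RGB)

    with torch.no_grad():
        pred = model(transform(img))              # (1, H', W') inverse-depth map
        depth = torch.nn.functional.interpolate(  # resample to input resolution
            pred.unsqueeze(1), size=img.shape[:2],
            mode="bicubic", align_corners=False,
        ).squeeze()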

Once trained, DCNNs can analyse images very quickly and are now becoming suitable for low-cost embedded platforms, such as smartphones. This aspect of the proposed investigation has the potential to simplify the sensor hardware dramatically: only single cameras, or stereo pairs of cameras, are required in combination with DCNNs as the basis for a vision system that could potentially provide all of the functionality required to control the hand in a wide range of scenarios.

Benchmark results will let us develop specific camera-algorithm combinations to improve performance for the specified use cases over a number of test-evaluate-improve iterations. The core 3D sensing approaches will be integrated with the SGS, and we shall evaluate additional off-hand cameras that provide critical ancillary visual input when objects are close to the gripper camera prior to a grasp, or during in-hand manipulation. Both camera systems will be used to acquire different views of the scene, with both systems mounted on a robot arm for interactive perception of the scene and the objects contained within it.
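
Combining the gripper-camera and off-hand views implies a simple fusion step: back-project each camera's depth map into 3D and map both clouds into the robot base frame using known extrinsics. A minimal sketch follows, with placeholder intrinsics, extrinsics and depth maps, since the project's actual calibration values are not given:

    # Fusing two depth views into the robot base frame (placeholder values).
    import numpy as np

    def backproject(depth, fx, fy, cx, cy):
        """Back-project a metric depth map to an N x 3 point cloud (camera frame)."""
        h, w = depth.shape
        v, u = np.mgrid[0:h, 0:w]
        z = depth.ravel()
        x = (u.ravel() - cx) * z / fx
        y = (v.ravel() - cy) * z / fy
        return np.stack([x, y, z], axis=1)[z > 0]

    def to_base(points, T_cam2base):
        """Apply a 4x4 homogeneous transform to camera-frame points."""
        homog = np.hstack([points, np.ones((len(points), 1))])
        return (homog @ T_cam2base.T)[:, :3]

    # Placeholder inputs: real values would come from the sensors and calibration.
    wrist_depth = np.full((480, 640), 0.5)    # gripper-mounted camera, metres
    offhand_depth = np.full((480, 640), 1.2)  # off-hand camera, metres
    T_wrist2base = np.eye(4)                  # from hand-eye calibration + kinematics
    T_offhand2base = np.eye(4)                # from a one-off extrinsic calibration

    merged = np.vstack([
        to_base(backproject(wrist_depth, 580.0, 580.0, 320.0, 240.0), T_wrist2base),
        to_base(backproject(offhand_depth, 580.0, 580.0, 320.0, 240.0), T_offhand2base),
    ])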

In parallel, we will develop a showcase demonstration system at Shadow based on Shadow's current grasp planning software, coupled to the 3D images captured by the benchmarked 3D vision systems. The developed vision modules will be encapsulated within the "Blockly" programming system to afford a simple and direct method for end-users to take advantage of this capability.

In conclusion, we believe that the robotics hand-eye pipelines proposed within the iSee project have the potential to play an important role in maintaining market leadership in the development of complete robotics system solutions.
Key Findings, Potential Use in Non-Academic Contexts, Impacts, and Sectors Submitted by the Researcher
This information can now be found on Gateway to Research (GtR): http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.gla.ac.uk