
Details of Grant 

EPSRC Reference: EP/X019632/1
Title: Insect-inspired depth perception
Principal Investigator: Webb, Professor B
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Festo SE & Co. KG
Opteran Technologies Ltd
Department: Sch of Informatics
Organisation: University of Edinburgh
Scheme: Standard Research
Starts: 01 February 2023
Ends: 31 January 2027
Value (£): 490,506
EPSRC Research Topic Classifications:
Bioelectronic Devices
Image & Vision Computing
Robotics & Autonomy
Vision & Senses - ICT appl.
EPSRC Industrial Sector Classifications:
Information Technologies
Related Grants:
EP/X019705/1
Panel History:
Panel Date: 20 Sep 2022
Panel Name: EPSRC ICT Prioritisation Panel September 2022
Outcome: Announced
Summary on Grant Application Form
Any animal, or robot, that wants to interact with objects needs to obtain information about their 3D shape. Humans use stereo vision (two views from two eyes) to gain information about depth, but require large brains to process this information. Robots have also been built that use stereo vision, or other kinds of depth sensors based on projected or reflected light. But these have a number of limitations, such as energy consumption, sensitivity to lighting conditions, and the amount of computational processing needed. We are interested in how insects solve the problem of 3D sensing, with small compound eyes and a tiny brain (altogether ~100,000 neurons), and whether this provides an alternative solution for robotics.

Insects such as fruit flies (Drosophila) can be studied with high-speed, high-resolution recordings of neural activity and behaviour. This has revealed that they use a special mechanism to obtain depth information, which involves motion of the individual light receptors in the eye. Eyes (unlike conventional cameras) register relative light change. In Drosophila, individual light-sensitive cells - corresponding to individual "pixels" of the scene - react to these light changes by generating a fast counter-motion, which we call a photoreceptor microsaccade. Each photoreceptor moves in a specific direction at its particular location inside the compound eye, transiently readjusting its own light input. The photoreceptor microsaccades are mirror-symmetric in the left and right eyes, meaning that the same light change makes them move simultaneously in opposite directions. Therefore, during binocular viewing, the pixels in one eye move transiently with the world and in the other eye against it. Ultimately, these opposing microsaccades should cause small timing differences in the electrical signals of the eyes and the brain networks, rapidly and accurately informing the fly of the 3D structure of the world.
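To make the timing-difference idea concrete, the following is a minimal numerical sketch in Python (our illustration for this summary, not the project's model or the fly's verified circuitry). It assumes each "pixel" sweeps its receptive field linearly through angle during a microsaccade and responds at the moment the field crosses a point of interest; mirror-symmetric sweeps then convert the left and right viewing angles into two response times, from which depth can be triangulated. All parameter values (baseline, sweep amplitude, sweep speed) are assumptions chosen for illustration.

import numpy as np

BASELINE = 0.3e-3                     # inter-ocular distance, metres (assumed, fly-scale)
SWEEP_AMPL = np.deg2rad(5.0)          # half-range of the microsaccade sweep (assumed)
SWEEP_SPEED = 2 * SWEEP_AMPL / 0.020  # rad/s: one full sweep in ~20 ms (assumed)

def true_angles(x, z):
    """Head-centred viewing angles of a point (x, z) from the two eyes."""
    return np.arctan2(x + BASELINE / 2, z), np.arctan2(x - BASELINE / 2, z)

def response_times(a_left, a_right):
    """Mirror-symmetric sweeps: the left pixel scans -A..+A while the right
    scans +A..-A; each responds when its receptive field crosses the point."""
    t_left = (a_left + SWEEP_AMPL) / SWEEP_SPEED
    t_right = (SWEEP_AMPL - a_right) / SWEEP_SPEED
    return t_left, t_right

def depth_from_times(t_left, t_right):
    """Decode the two viewing angles from the response times, then triangulate."""
    a_left = SWEEP_SPEED * t_left - SWEEP_AMPL
    a_right = SWEEP_AMPL - SWEEP_SPEED * t_right
    return BASELINE / (np.tan(a_left) - np.tan(a_right))

# A point 0.1 mm off the midline: closer objects give larger left/right
# timing differences, and the decoded depth matches the true depth.
for z in (0.005, 0.02, 0.1):          # true depths in metres
    t_l, t_r = response_times(*true_angles(1e-4, z))
    print(f"depth {z*1e3:5.1f} mm -> dt = {(t_l - t_r)*1e3:+6.3f} ms, "
          f"decoded {depth_from_times(t_l, t_r)*1e3:5.1f} mm")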

We now want to determine exactly how the Drosophila brain networks utilise this mirror-symmetric left- and right-eye information to produce super-resolution stereo vision. We will build realistic models of binocular stereo information processing in the fly and use these to reproduce and predict responses to 3D objects. We will test the efficiency of this encoding in Artificial Neural Network (ANN) simulations driven by microsaccadic sampling. This approach will be combined with experiments on Drosophila that monitor neural activity during 3D object stimulation, and with behavioural tests that reveal the animal's 3D perception capabilities. Our hypotheses about function will then be realised and tested in hardware, to determine whether the same depth-sensing capabilities can be obtained either through conventional camera input processed in a novel way, or through the design of a novel light-sensing array that incorporates individual movement of the elements. The outcome will be a new method to efficiently detect 3D shape, which would have multiple potential applications, e.g. robot grasping tasks.
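As an illustration of the first hardware route (conventional camera input processed in a novel way), here is another minimal sketch, again our own assumption-laden illustration rather than the project's design: the two views of a stereo pair are resampled at opposite sub-pixel offsets to emulate mirror-symmetric microsaccades, only the temporal brightness change is kept (since photoreceptors register relative light change), and the disparity of an edge is read off the relative timing of the left and right change responses.

import numpy as np

def microsaccadic_scan(profile, offsets):
    """Resample a 1-D intensity profile at a sweep of sub-pixel offsets,
    giving one temporal trace per pixel (rows = frames, cols = pixels)."""
    x = np.arange(profile.size, dtype=float)
    return np.stack([np.interp(x + s, x, profile) for s in offsets])

pix = np.arange(128, dtype=float)

def edge(centre):
    """Soft luminance edge centred at the given pixel position."""
    return 1.0 / (1.0 + np.exp(-(pix - centre) / 0.8))

left_view = edge(67.5)    # edge position as seen by the left eye
right_view = edge(64.5)   # shifted by a true disparity of 3 px

sweep = np.linspace(-4.0, 4.0, 81)                 # sweep offsets in pixels
frames_l = microsaccadic_scan(left_view, +sweep)   # left pixels sweep one way,
frames_r = microsaccadic_scan(right_view, -sweep)  # right pixels mirror them

# Photoreceptor-like encoding: respond to brightness *change* over time.
resp_l = np.abs(np.gradient(frames_l, axis=0))
resp_r = np.abs(np.gradient(frames_r, axis=0))

# For the pixel at x = 64, find *when* in the sweep each response peaks.
k_l = resp_l[:, 64].argmax()
k_r = resp_r[:, 64].argmax()

# The left peak time gives the edge offset sweep[k_l] in the left view;
# because the right sweep is mirrored, the right-view offset is -sweep[k_r].
# Disparity = left offset - right offset = sweep[k_l] + sweep[k_r].
print(f"estimated disparity: {sweep[k_l] + sweep[k_r]:.2f} px (true: 3.00)")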
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary:
Date Materialised:
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.ed.ac.uk