
Details of Grant 

EPSRC Reference: EP/I031758/1
Title: Insect-inspired visually guided autonomous route navigation through natural environments
Principal Investigator: Philippides, Professor A
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Department: Sch of Engineering and Informatics
Organisation: University of Sussex
Scheme: First Grant - Revised 2009
Starts: 01 May 2011 Ends: 30 April 2013 Value (£): 102,329
EPSRC Research Topic Classifications:
Control Engineering
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
Panel History:
Panel Date: 16 Feb 2011
Panel Name: Materials, Mechanical and Medical Engineering
Outcome: Announced
Summary on Grant Application Form
Our overall objective is to develop algorithms for long-distance, route-based visual navigation through complex natural environments. Despite recent advances in autonomous navigation, especially in map-based simultaneous localisation and mapping (SLAM), guiding a return to a goal location through unstructured natural terrain remains an open issue and an active area of research. Despite their small brains and noisy, low-resolution sensors, insects navigate through such environments with a level of performance that outstrips state-of-the-art robot algorithms. It is therefore natural to take inspiration from insects. There is a history of bio-inspired navigation models in robotics, but known components of insect behaviour have yet to be incorporated into engineering solutions. In contrast with most modern robotic methods, insects navigate between two locations using procedural route knowledge rather than mental maps. An important feature of route navigation is that the agent does not need to know where it is at every point (in the sense of localising itself within a cognitive map), but rather what it should do. Insects provide further inspiration for navigation algorithms through their innate behavioural adaptations, which simplify navigation through unstructured, cluttered environments.

One objective is to develop navigation algorithms which capture the elegance and desirable properties of insect homing strategies: robustness (in the face of natural environmental variation), parsimony (of mechanism and visual encoding), speed of learning (insects must learn from their first excursion) and efficacy (the scale over which insects forage). Before this, we will bring together current insights into insect behaviour with novel technologies which allow us to recreate visual input from the perspective of foraging insects. This will lead to new tools for biologists and increase our understanding of insect navigation.
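The idea that a route-following agent needs to know "what to do" rather than "where it is" can be made concrete with a minimal sketch: memorise panoramic snapshots along a training route, then recover a heading by rotating the current view until it best matches one of the memorised views. This is an illustrative toy in the spirit of view-based route following, not the project's actual algorithm; the function name and 1-D one-value-per-degree panorama are assumptions made for the example.

```python
import numpy as np

def route_following_heading(current_view, stored_views):
    """Pick the rotation that makes the current view most 'familiar'.

    current_view : 1-D array, a panoramic brightness snapshot
                   (assumed here: one value per degree of azimuth).
    stored_views : list of 1-D arrays memorised along the training route.

    Returns the rotation (in array positions, i.e. degrees here) that
    minimises the pixel-wise difference to the best-matching stored view.
    Note that no position estimate is ever computed: the output is an
    action (turn by this much), not a location.
    """
    best_rotation, best_score = 0, np.inf
    for rotation in range(len(current_view)):
        rotated = np.roll(current_view, rotation)
        # Familiarity = distance to the nearest memorised view.
        score = min(np.sum((rotated - v) ** 2) for v in stored_views)
        if score < best_score:
            best_rotation, best_score = rotation, score
    return best_rotation
```

For example, if the agent has turned 10 degrees away from the heading at which a view was memorised, the function recovers a corrective rotation of 10 from the image comparison alone.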
In order to achieve these goals our Work Packages will be:

WP1: Development of tools for reconstructing large-scale natural environments. We will adapt an existing panoramic camera system to enable reconstruction of the visual input experienced by foraging bees. Similarly, we will adapt new computer vision methods to enable us to build world models of the cluttered habitats of ants.

WP2: Investigation of optimal visual encodings for navigation. Using the world model developed in WP1, we will investigate the stability and performance of different ways of encoding a visual scene.

WP3: Autonomous route navigation algorithms. We will test a recently developed model of route navigation and augment it for robust performance in natural environments.

Our approach in this project is novel and timely. The panoramic camera system has just been developed at Sussex. The methods for building world models have only recently become practical and have not yet been applied in this context. The proposed route navigation methodology is newly developed at Sussex and is based on insights into insect behaviour only recently observed. Increased knowledge of route navigation will be of interest to engineers and biologists. Parsimonious route-following algorithms will be of use in situations where an agent must reliably navigate between two locations, such as a robotic courier or a search-and-rescue robot. Our algorithms also have potential broader applications, such as improving guidance aids for the visually impaired. Biologists and the wider academic community will be able to use the tools developed to gain an understanding of the visual input during behavioural experiments, leading to a deeper understanding of target systems.
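To illustrate the kind of comparison the visual-encodings work involves, one candidate encoding is heavy angular downsampling, mimicking insects' low-resolution vision; different encodings can then be scored by how a simple view-difference measure behaves around a goal. This is a hedged sketch only: the encoding, bin count and function names here are illustrative assumptions, not the encodings the project actually evaluates.

```python
import numpy as np

def downsample_encoding(view, n_bins=36):
    """Encode a 1-D panoramic snapshot at insect-like low resolution by
    averaging it into n_bins azimuthal bins (10 degrees/bin for a
    360-sample panorama). n_bins=36 is an arbitrary illustrative choice."""
    bins = np.array_split(np.asarray(view, dtype=float), n_bins)
    return np.array([b.mean() for b in bins])

def view_difference(a, b):
    """Root-mean-square difference between two encoded views; small
    values mean the two scenes look alike under this encoding."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(np.sqrt(np.mean((a - b) ** 2)))
```

Computing `view_difference` between the encoding at a goal and the encodings at surrounding positions maps out how gracefully an encoding degrades with displacement, which is one simple way to compare the stability of different scene encodings.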
There is specific current interest from Rothamsted Agricultural Institute, who want to know how changes in flight patterns affect the visual input and navigational efficacy of honeybee foragers from colonies affected by factors such as pesticides, or at risk of colony collapse disorder.
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL: http://www.sussex.ac.uk/lifesci/insectnavigation
Further Information:  
Organisation Website: http://www.sussex.ac.uk