
Details of Grant 

EPSRC Reference: EP/N012372/1
Title: TASCC: Pervasive low-TeraHz and Video Sensing for Car Autonomy and Driver Assistance (PATH CAD)
Principal Investigator: Gashinova, Professor M
Other Investigators:
Cherniakov, Professor M; Gardner, Professor P
Researcher Co-Investigators:
Project Partners:
Department: Electronic, Electrical and Computer Eng
Organisation: University of Birmingham
Scheme: Standard Research - NR1
Starts: 01 December 2015
Ends: 31 March 2020
Value (£): 852,950
EPSRC Research Topic Classifications:
Image & Vision Computing
Instrumentation Eng. & Dev.
Robotics & Autonomy
EPSRC Industrial Sector Classifications:
Transport Systems and Vehicles
Related Grants:
EP/N012240/1, EP/N012402/1
Panel History:
Panel Date: 28 Jul 2015
Panel Name: Towards Autonomy - Smart and Connected Control (Interview)
Outcome: Announced
Summary on Grant Application Form
This project combines novel low-THz (LTHz) sensor development with advanced video analysis, fusion and cross-learning. Using the two streams integrated within the sensing, information and control systems of a modern automobile, we aim to map terrain and identify hazards such as potholes and surface texture changes in all weathers, and to detect and classify other road users (pedestrians, cars, cyclists, etc.).

The coming era of autonomous and assisted driving necessitates new all-weather sensing technology. Advanced interaction between the sensed and processed data, the control systems and the driver can deliver autonomy in decision and control while providing the driver with all the information needed to intervene in critical situations. The aims are to improve road safety through increased situational awareness, and to increase energy efficiency by reducing the pollutant emissions caused by poor control and resource use in both on- and off-road vehicles.

Video cameras remain at the heart of our system for many reasons: low cost, availability, high resolution, a large legacy of processing algorithms to interpret the data, and driver/passenger familiarity with the output. However, it is widely recognized that video and/or other optical sensors such as LIDAR (cf. the Google car) are not sufficient. The same conditions that challenge human drivers, such as heavy rain, fog, spray, snow and dust, limit the capability of electro-optical sensors. We require a new approach.

The key second sensor modality is a low-THz radar system operating within the 0.3-1 THz frequency spectrum. By its very nature, radar is robust to the conditions that limit video. However, it is the relatively short wavelength and wide bandwidth of this LTHz radar with respect to existing automotive radar systems that can bring key additional capabilities. This radar has the potential to provide: (i) imagery that is closer to familiar video than that provided by a conventional radar, and hence can begin to exploit the vast legacy of image processing algorithms; (ii) significantly improved across-road image resolution, leading to correspondingly significant improvements in the detection and classification of vehicles, pedestrians and other 'actors' (cyclists, animals, etc.); (iii) 3D images that can highlight objects and act as an input to the guidance and control system; (iv) analysis of radar image features, such as shadows and image texture, that will contribute to both classification and control.
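
As a rough indication of why the shorter wavelength and wider bandwidth matter, the sketch below is a minimal, illustrative Python calculation, not a project specification: the bandwidths and the 10 cm antenna aperture are assumed round numbers rather than figures from the grant. It compares the standard first-order formulas, range resolution c/(2B) and real-aperture cross-range resolution R*lambda/D, for a conventional 77 GHz automotive radar and a hypothetical 300 GHz low-THz radar.

# Illustrative sketch, not project specifications: why a low-THz radar can
# image more finely than a conventional 77 GHz automotive radar. The chosen
# bandwidths and the 10 cm antenna aperture are assumed round numbers.
C = 3.0e8  # speed of light, m/s

def range_resolution(bandwidth_hz):
    # First-order range resolution: delta_R = c / (2 * B)
    return C / (2.0 * bandwidth_hz)

def cross_range_resolution(freq_hz, aperture_m, range_m):
    # Real-aperture cross-range resolution: approximately R * wavelength / D
    wavelength = C / freq_hz
    return range_m * wavelength / aperture_m

for name, freq_hz, bandwidth_hz in [("77 GHz automotive radar", 77e9, 1e9),
                                    ("300 GHz low-THz radar", 300e9, 20e9)]:
    print(f"{name}: range resolution ~ {100 * range_resolution(bandwidth_hz):.1f} cm, "
          f"cross-range resolution at 20 m ~ {100 * cross_range_resolution(freq_hz, 0.1, 20.0):.1f} cm")

On these assumed numbers the low-THz system resolves roughly 0.75 cm in range and 20 cm across the road at a range of 20 m, against roughly 15 cm and 78 cm for the 77 GHz case, which is the sense in which the summary claims significantly improved across-road resolution.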

The project is a collaboration between three academic institutions: the University of Birmingham, with its long-standing excellence in automotive radar research and radar technologies; the University of Edinburgh, with world-class expertise in signal processing and radar imaging; and Heriot-Watt University, with equivalent skill in video analytics, LiDAR and accelerated algorithms. The novel approach will be based on fusion of video and radar images in a cross-learning cognitive process, to improve the reliability and quality of information acquired by an external sensing system operating in all-weather, all-terrain road conditions without dependency on navigation-assisting systems.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.bham.ac.uk