
Details of Grant 

EPSRC Reference: EP/T014598/1
Title: CiViL: Common-sense- and Visually-enhanced natural Language generation
Principal Investigator: Gkatzia, Dr D
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Emotech Ltd
Department: School of Computing
Organisation: Edinburgh Napier University
Scheme: New Investigator Award
Starts: 28 September 2020 Ends: 27 September 2023 Value (£): 280,060
EPSRC Research Topic Classifications:
Artificial Intelligence
Computational Linguistics
Human-Computer Interactions
Image & Vision Computing
EPSRC Industrial Sector Classifications:
No relevance to Underpinning Sectors
Related Grants:
Panel History:
Panel Date: 06 Nov 2019
Panel Name: EPSRC ICT Prioritisation Panel November 2019
Outcome: Announced
Summary on Grant Application Form
One of the most compelling problems in Artificial Intelligence is to create computational agents capable of interacting in real-world environments using natural language. Computational agents such as robots can offer multiple benefits to society: for instance, they can look after the ageing population, act as companions, support skills training, or even provide assistance in public spaces. These are extremely challenging tasks due to their complex interdisciplinary nature, which spans several fields including Natural Language Generation, engineering, computer vision, and robotics.

Communication through language is the most vital and natural way of interacting. Humans are able to communicate effectively with each other using natural language, utilising common-sense knowledge and making inferences about other people's backgrounds from previous interactions with them. At the same time, they can successfully describe their surroundings, even when encountering unknown entities and objects. For decades, researchers have tried to recreate the way humans communicate through natural language, and although there have been major breakthroughs in recent years (such as Apple's Siri or Amazon's Alexa), Natural Language Generation systems still lack the ability to reason, exploit common-sense knowledge, and utilise multi-modal information from a variety of sources such as knowledge bases, images, and videos.

This project aims to develop a framework for common-sense- and visually-enhanced Natural Language Generation that enables natural, real-time communication between humans and artificial agents such as robots, supporting effective human-robot collaboration. Human-Robot Interaction poses additional challenges for Natural Language Generation due to uncertainty arising from dynamic environments and the non-deterministic nature of interaction. For instance, the viewpoint of a situated robot changes as it moves, and hence so does its representation of the world; current state-of-the-art methods, which cannot adapt to changing environments, fail in such settings. The project will investigate methods for linking various modalities, taking into account their dynamic nature. To achieve natural, efficient and intuitive communication capabilities, agents will also need to acquire human-like abilities in synthesising knowledge and expression. The conditions under which external knowledge bases (such as Wikipedia) can enhance natural language generation, and whether existing knowledge bases are useful for generation at all, still have to be explored.

The novel ways of integrating multi-modal data for language generation will lead to more robust and efficient interactions and will have an impact on natural language generation, social robotics, computer vision, and related fields. This might, in turn, spawn entirely novel applications, such as explaining exact procedures for e-health treatments and enhancing tutoring systems for educational purposes.

Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.napier.ac.uk