Details of Grant 

EPSRC Reference: EP/V020579/1
Title: Turing AI Fellowship: Event-Centric Framework for Natural Language Understanding
Principal Investigator: He, Professor Y
Other Investigators:
Researcher Co-Investigators:
Project Partners:
Actable AI Ltd
AstraZeneca
Google
University of Edinburgh
Department: Computer Science
Organisation: University of Warwick
Scheme: EPSRC Fellowship - NHFP
Starts: 01 January 2021
Ends: 30 September 2022
Value (£): 1,269,626
EPSRC Research Topic Classifications:
Artificial Intelligence
Information & Knowledge Mgmt
EPSRC Industrial Sector Classifications:
Information Technologies
Related Grants:
Panel History:
Panel Date: 06 Oct 2020
Panel Name: Turing AI Acceleration Fellowship Interview Panel B
Outcome: Announced
Summary on Grant Application Form
Natural language understanding (NLU) aims to allow computers to understand text automatically. NLU may seem easy to humans, but it is extremely difficult for computers because of the variety, ambiguity, subtlety, and expressiveness of human languages. Recent efforts in NLU have largely been exemplified in tasks such as natural language inference, reading comprehension and question answering. A common practice is to pre-train a language model such as BERT on large corpora to learn word representations and fine-tune it on task-specific data. Although BERT and its successors have achieved state-of-the-art performance in many NLP tasks, it has been found that pre-trained language models mostly reason only about the surface form of entity names and fail to capture rich factual knowledge. Moreover, NLU models built on such pre-trained language models are susceptible to adversarial attacks: even a small perturbation of an input (e.g., paraphrasing questions and/or answers in QA tasks) can cause a dramatic decrease in model performance, showing that such models rely largely on shallow cues.

In human reading, successful reading comprehension depends on the construction of an event structure that represents what is happening in text, often referred to as the situation model in cognitive psychology. The situation model also involves the integration of prior knowledge with information presented in text for reasoning and inference. Fine-tuning pre-trained language models for reading comprehension does not help in building such effective cognitive models of text and comprehension suffers as a result.

In this fellowship, I aim to develop a knowledge-aware and event-centric framework for natural language understanding, in which event representations are learned from text with the incorporation of prior background and common-sense knowledge; event graphs are built on the fly as reading progresses; and the comprehension model self-evolves to understand new information. I will focus primarily on reading comprehension, and my goal is to enable computers to solve a variety of cognitive tasks that mimic human-like cognitive capabilities, bringing us a step closer to human-like intelligence.
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Impacts
Description: This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Summary
Date Materialised
Sectors submitted by the Researcher
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Project URL:  
Further Information:  
Organisation Website: http://www.warwick.ac.uk