
Details of Grant 

EPSRC Reference: EP/V061755/1
Title: iSee: Intelligent Sharing of Explanation Experience by Users for Users
Principal Investigator: Wiratunga, Professor NC
Other Investigators:
Martin, Mr K; Corsar, Dr D
Researcher Co-Investigators:
Miss A Wijekoon
Project Partners:
Department: School of Comp Sci & Digital Media
Organisation: Robert Gordon University
Scheme: Standard Research - NR1
Starts: 01 March 2021
Ends: 29 February 2024
Value (£): 272,583
EPSRC Research Topic Classifications:
EPSRC Industrial Sector Classifications:
Related Grants:
Panel History:  
Summary on Grant Application Form
The iSee Project will show how users of Artificial Intelligence (AI) can capture, share and re-use their experiences of AI explanations with other users who have similar explanation needs.

To clarify this further, let us use the phrase 'explanation strategy' to refer collectively to the algorithms and visualization methods used to explain the predictions of models built by Machine Learning (ML). Such strategies can be foundational, of the kind found in the research literature. However, user needs are often multi-faceted, and real-world applications and different users can require composite strategies, formed by combining the basic building blocks provided by one or more foundational strategies.

We hypothesise that an end-user's explanation experience (like much other problem-solving experience) contains implicit knowledge about how their explanation need was met: the preferred strategy (foundational or composite) and, for composites, the manner of combination. We will provide the platform needed to capture these experiences by enabling users to interact with, experiment with, and evaluate explanations. Once captured, experiences can be reused, on the premise that similar user needs can be met with similar explanation strategies. Reused experiences reinforce strategies that work in given circumstances, while others expose cases where a suitable strategy has yet to be discovered.

Our proposal describes in detail how we will develop an ontology for describing a library of explanation strategies; develop measures to evaluate their applicability and suitability; and design a representation to capture experiences of using explanation strategies. We explain how the case-based reasoning (CBR) paradigm can be used to discover composites and thereafter reuse them through algorithms that implement the main steps of a CBR cycle (retrieve, reuse, revise and retain), and why CBR is well placed to promote best practice in explainable AI. We include a number of high-impact use cases in which we work with real-world users to co-design the representations and algorithms described above and to evaluate and validate our approach. Our proposal also gives one possible route by which companies could certify compliance with explainable AI regulations and guidelines.
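The retrieve and reuse steps of the CBR cycle described above can be sketched in a few lines. Note that the case fields (`user_need`, `strategy`, `outcome`), the attribute-overlap similarity measure, and the example strategy names below are illustrative assumptions for this sketch, not the project's actual ontology or representation.

```python
# Minimal sketch of the CBR 'retrieve' and 'reuse' steps for explanation
# experiences. All field names and similarity weights are hypothetical.
from dataclasses import dataclass


@dataclass
class ExplanationCase:
    """One captured explanation experience (assumed structure)."""
    user_need: dict   # e.g. {"expertise": "novice", "modality": "visual"}
    strategy: list    # ordered foundational strategies forming a composite
    outcome: float    # user evaluation score in [0, 1]


def similarity(need_a: dict, need_b: dict) -> float:
    """Fraction of matching need attributes (simple overlap measure)."""
    keys = set(need_a) | set(need_b)
    if not keys:
        return 0.0
    return sum(need_a.get(k) == need_b.get(k) for k in keys) / len(keys)


def retrieve(case_base, query_need, k=1):
    """Return the k past experiences most similar to the new need."""
    ranked = sorted(case_base,
                    key=lambda c: similarity(c.user_need, query_need),
                    reverse=True)
    return ranked[:k]


# Reuse: adopt the strategy of the best-matching past experience.
case_base = [
    ExplanationCase({"expertise": "novice", "modality": "visual"},
                    ["saliency-map"], 0.8),
    ExplanationCase({"expertise": "expert", "modality": "text"},
                    ["counterfactual", "feature-importance"], 0.9),
]
best = retrieve(case_base, {"expertise": "novice", "modality": "visual"})[0]
```

The revise and retain steps would then update `best.strategy` in light of the new user's feedback and append the revised case to `case_base`, closing the cycle.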
Key Findings
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk
Potential use in non-academic contexts
Description
Date Materialised
Sectors submitted by the Researcher
Project URL:  
Further Information:  
Organisation Website: http://www.rgu.ac.uk