EPSRC Reference: 
EP/W007886/1 
Title: 
Uncertainty Quantification at the Exascale (EXAUQ) 
Principal Investigator: 
Challenor, Professor P 
Other Investigators: 

Researcher CoInvestigators: 

Project Partners: 

Department: 
Mathematics 
Organisation: 
University of Exeter 
Scheme: 
Standard Research 
Starts: 
02 August 2021 
Ends: 
01 August 2024 
Value (£): 
1,006,031

EPSRC Research Topic Classifications: 
Computer Sys. & Architecture 
Mathematical Analysis 
Software Engineering 


EPSRC Industrial Sector Classifications: 
No relevance to Underpinning Sectors 


Related Grants: 

Panel History: 

Summary on Grant Application Form 
Exascale computing offers the prospect of running numerical models, for example of nuclear fusion and the climate, at unprecedented resolution and fidelity, but such models are still subject to uncertainty, and we need to be able to quantify those uncertainties (and, for example, use data on model outputs to calibrate the model inputs). Exascale computing comes at a cost: we will never be able to run huge ensembles of models on exascale computers. Naive methods, such as Monte Carlo, where we simply sample from the probability distribution of the model inputs, run a huge ensemble of models and produce a sample from the output distribution, are not feasible. We need to develop uncertainty quantification methodology that allows us to perform sensitivity and uncertainty calculations efficiently and effectively with the minimum number of exascale model runs.
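To make the naive approach concrete, here is a minimal Monte Carlo sketch in Python; `model` is a hypothetical cheap stand-in, whereas a real exascale simulator would take hours per run, making a sample of this size impossible:

```python
import numpy as np

rng = np.random.default_rng(1)

def model(x):
    # Hypothetical stand-in for an expensive simulator.
    return x ** 2 + np.sin(x)

# Naive Monte Carlo: sample the input distribution, run the model once per
# sample, and summarise the resulting output distribution.
xs = rng.normal(0.0, 1.0, size=100_000)   # 100,000 model runs
ys = model(xs)
mean, sd = ys.mean(), ys.std()
```

For a standard normal input, the true output mean here is 1 (since E[x^2] = 1 and E[sin x] = 0), which the sample mean approaches only because the stand-in model is essentially free to evaluate.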
Our methods are based on the idea of an emulator. An emulator is a fast statistical approximation linking model inputs and outputs, possibly nonlinearly. It also includes a measure of its own uncertainty, so we know how well it is approximating the original numerical model. Our emulators are based on Gaussian processes. Normally we would run a designed experiment and use the results to train the emulator. Because of the cost of exascale computing, we use a hierarchy of models, from fast, low-fidelity versions, through higher-fidelity, more computationally expensive ones, to the very expensive, very high-fidelity model at the apex of the hierarchy. Building a joint emulator for all the models in the hierarchy allows us to borrow strength from the low-fidelity ones to emulate the exascale model. Although such ideas have been around for a number of years, they have not been much exploited for very large models.
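A minimal sketch of a Gaussian process emulator, using scikit-learn rather than any project-specific code; `simulator` is a hypothetical cheap stand-in for an expensive model, and the kernel choice is an illustrative assumption:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):
    # Hypothetical stand-in: in practice this would be one expensive model run.
    return np.sin(3 * x) + 0.5 * x

# A small designed experiment: training inputs and the model outputs there.
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = simulator(X_train).ravel()

# Fit the GP emulator; the RBF kernel encodes a smoothness assumption.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(X_train, y_train)

# Predict at a new input, with the emulator's own uncertainty estimate.
mean, std = gp.predict(np.array([[1.1]]), return_std=True)
```

The returned `std` is the emulator's self-reported uncertainty: near the training inputs it is small, and it grows where the emulator is extrapolating.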
We will extend the existing theory in a number of new ways. First, we will look at the problem of design. To exploit the hierarchy to its fullest extent, we need an experimental design that allocates model runs to the correct layer of the model hierarchy. We will extend existing sequential design methodology to work with hierarchies of models, finding not only the optimal next set of inputs at which to run the model but also the level of the hierarchy at which it should be run. We will also ensure that the sequential design is 'batch' sequential, allowing us to run ensembles rather than waiting for each run to return its answer.
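A toy sketch of one common batch-sequential heuristic (the "kriging believer" approach), again with scikit-learn; the project's actual criterion would additionally choose the hierarchy level, which this single-level illustration omits:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def simulator(x):
    # Hypothetical cheap stand-in for the expensive model.
    return np.sin(3 * x)

# Initial design and candidate inputs to choose from.
X_train = np.array([[0.0], [1.0], [2.0]])
y_train = simulator(X_train).ravel()
candidates = np.linspace(0.0, 2.0, 101).reshape(-1, 1)

# Greedy "kriging believer" batch: pick the candidate with the largest
# predictive standard deviation, plug in the emulator's own mean as a
# pseudo-observation, refit, and repeat. The three chosen inputs can then
# be submitted to the computer together as one ensemble.
batch = []
for _ in range(3):
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
    gp.fit(X_train, y_train)
    mean, std = gp.predict(candidates, return_std=True)
    i = int(np.argmax(std))
    batch.append(float(candidates[i, 0]))
    X_train = np.vstack([X_train, candidates[i:i + 1]])
    y_train = np.append(y_train, mean[i])
```

Because each chosen point is "believed" rather than run, all three runs can be dispatched in parallel, which is the point of a batch design on a machine where queue time dominates.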
Because the inputs and outputs of exascale models are often fields of correlated values, we will develop methods for handling such high-dimensional inputs and outputs and for relating them to the other levels of the hierarchy.
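One widely used idea for field outputs (a sketch of the general approach, not the project's specific method) is to project each field onto a low-dimensional basis, for example by principal component analysis, and then emulate each basis coefficient as a scalar:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical ensemble: 20 model runs, each producing a field of 500 values
# built from two spatial modes, so the outputs are strongly correlated.
grid = np.linspace(0.0, 1.0, 500)
mode1, mode2 = np.sin(2 * np.pi * grid), np.cos(2 * np.pi * grid)
coeffs = rng.uniform(-1.0, 1.0, size=(20, 2))
fields = coeffs @ np.vstack([mode1, mode2])   # shape (20, 500)

# Reduce each 500-dimensional field to two basis coefficients; one scalar
# emulator per coefficient then replaces a 500-output emulator.
pca = PCA(n_components=2)
scores = pca.fit_transform(fields)
reconstruction_error = np.max(np.abs(pca.inverse_transform(scores) - fields))
```

Here the fields genuinely lie in a two-dimensional subspace, so the reconstruction is essentially exact; for real model output more components are needed, traded off against the number of scalar emulators to fit.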
Finally we will investigate whether AI methods other than Gaussian processes can be used to build efficient emulators.

Key Findings 
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Potential use in nonacademic contexts 
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Impacts 
Description 
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk 
Summary 

Date Materialised 


Sectors submitted by the Researcher 
This information can now be found on Gateway to Research (GtR) http://gtr.rcuk.ac.uk

Project URL: 

Further Information: 

Organisation Website: 
http://www.ex.ac.uk 