Room: Kilburn Building, Lecture Theatre 1.1


8:30-9:00 Arrivals

9:00-9:10 Welcome

9:10-10:10 Gaussian Process Emulators for Cardiac Digital Twins: Enabling Scalable Patient-Specific Modeling Steven Niederer, Imperial College London

10:10-10:40 Coffee Break

10:40-11:40 Bayesian Surrogate Modelling of Computer Experiments using Gaussian Processes Aleksandra Svalova, Newcastle University [slides]

11:40-12:40 Identifying Dynamic Systems for Digital Twins of Engineering Assets Timothy Rogers, University of Sheffield [slides]

12:40-14:00 Lunch

14:00-15:00 Gaussian process emulation to rigorously explore uncertainty in complex models of the atmosphere and climate Jill Johnson, University of Sheffield [slides]

15:00-15:30 Tea Break

15:30-16:30 Modelling London’s Air Quality Using Spatio-Temporal Gaussian Processes Ollie Hamelijnck, University of Warwick [slides]



Abstracts


Gaussian Process Emulators for Cardiac Digital Twins: Enabling Scalable Patient-Specific Modeling

Abstract

Cardiac digital twins are in-silico representations of a patient’s heart, continually updated as new data emerge. These multi-scale, physics- and physiology-constrained models, characterized by over 100 intricate parameters, demand substantial computational resources for their calibration and analysis. Historically, the systematic analysis of model sensitivity, model calibration, and uncertainty estimation for a multitude of patient-specific computational models has been computationally infeasible. In this presentation, Prof. Steven Niederer will share early case studies and illustrative examples demonstrating how Gaussian Process Emulators are enabling the construction of individual patient heart models from clinical data.
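
As a rough sketch of the emulation idea only (not of the cardiac models described in the talk), the example below trains a GP emulator on a small design of runs from a toy stand-in "simulator" and then uses the cheap emulator to screen a large number of candidate parameter sets against an observation, in the spirit of history matching. The toy function, measurement noise and implausibility cut-off are all illustrative assumptions.

```python
# Minimal sketch: train a GP emulator on a small design of runs from a toy
# "simulator", then use the cheap emulator to screen many candidate parameter
# sets against an observation (history-matching style). All numbers are
# illustrative assumptions, not values from the cardiac models.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

def simulator(theta):
    # Toy stand-in for one expensive patient-specific model output.
    return np.sin(3 * theta[..., 0]) + 0.5 * theta[..., 1] ** 2

# 1. Run the expensive simulator at a small space-filling design.
X_design = rng.uniform(0, 1, size=(40, 2))
y_design = simulator(X_design)

# 2. Fit a GP emulator to the design runs.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
emulator = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
emulator.fit(X_design, y_design)

# 3. Screen a large set of candidate parameters with the emulator only.
y_obs, obs_sd = 0.8, 0.05            # assumed measurement and its noise level
X_cand = rng.uniform(0, 1, size=(100_000, 2))
mu, sd = emulator.predict(X_cand, return_std=True)
implausibility = np.abs(mu - y_obs) / np.sqrt(sd**2 + obs_sd**2)
plausible = X_cand[implausibility < 3.0]   # conventional 3-sigma cut-off
print(f"{len(plausible)} of {len(X_cand)} candidate parameter sets retained")
```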



Bayesian Surrogate Modelling of Computer Experiments using Gaussian Processes

Abstract

Computer simulations/experiments of complicated physical processes are a cost-effective way of investigating input-output relationships. Though they provide an efficiency gain when compared to in-situ or laboratory experiments, computer simulations can still be very time-consuming and computationally expensive. Surrogate models can alleviate this expense, and Gaussian processes in particular have recently been widely used to emulate computer experiments. In this work, Gaussian processes are used to emulate the deterioration of large infrastructure assets, such as transport cuttings. The emulation allows us to approximate the relationship between deterioration and earthwork properties, e.g. height, slope angle, and soil strength. As the resulting Gaussian process model is hierarchical, it can have hundreds of parameters that need to be optimised, for which we use Bayesian modelling. The resulting model can be used in real time to estimate the extent of earthwork deterioration and has the potential to inform slope maintenance and design.
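
A minimal sketch of the Bayesian (rather than point-estimate) treatment of GP hyperparameters, using a hand-rolled random-walk Metropolis sampler over the GP log marginal likelihood of a one-dimensional toy dataset. The hierarchical deterioration model in the talk is far richer; the data, kernel and priors below are invented purely for illustration.

```python
# Sketch: Bayesian inference over GP hyperparameters via random-walk Metropolis
# on the log marginal likelihood. Synthetic stand-ins for earthwork properties
# (inputs) and deterioration (output); all settings are illustrative.
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(30, 1))                        # e.g. normalised slope angle
y = np.sin(6 * X[:, 0]) + 0.1 * rng.standard_normal(30)    # e.g. deterioration measure

def log_marginal(log_params):
    ls, sf, sn = np.exp(log_params)          # lengthscale, signal sd, noise sd
    d2 = (X - X.T) ** 2
    K = sf**2 * np.exp(-0.5 * d2 / ls**2) + sn**2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return (-0.5 * y @ alpha - np.log(np.diag(L)).sum()
            - 0.5 * len(X) * np.log(2 * np.pi))

def log_prior(log_params):
    return -0.5 * np.sum(log_params ** 2)    # weak N(0, 1) prior on log-parameters

# Random-walk Metropolis over the three log-hyperparameters.
theta = np.zeros(3)
lp = log_marginal(theta) + log_prior(theta)
samples = []
for _ in range(5000):
    prop = theta + 0.1 * rng.standard_normal(3)
    lp_prop = log_marginal(prop) + log_prior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.exp(np.array(samples)[1000:])   # discard burn-in
print("posterior mean (lengthscale, signal sd, noise sd):", samples.mean(axis=0))
```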



Identifying Dynamic Systems for Digital Twins of Engineering Assets

Abstract

The pursuit of greater levels of virtualisation and integration of data and models in engineering is currently expressed as the desire to construct a digital twin of an asset. This twin is then used to inform operational and maintenance decisions, assist in design and planning, and provide greater insight into performance. One key challenge is ensuring that the twin accurately reproduces the behaviour of the physical structure. This aim can be approached via physical modelling, data-driven methods, or a combination of the two. This talk will focus on one task within the wider framework: identifying dynamic systems (those governed by ordinary or partial differential equations) from data whilst respecting prior knowledge about the system. The tool of choice will be the Gaussian process (GP), which provides a flexible Bayesian framework in which to conduct this identification. Challenges are associated with the use of the GP, namely that in its standard form it only learns *static* maps from inputs to outputs and so cannot be applied directly to dynamic data. It will be shown how, in the setting of dynamic system identification, the GP can still be a powerful and valuable tool. Of particular interest will be how the GP allows engineering knowledge to be encoded in the structure of the model, enabling more efficient identification than a purely black-box approach.
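
One common way to let a GP, which natively learns static maps, represent a dynamic system is an autoregressive (GP-NARX style) formulation in which lagged outputs and inputs are used as regressors. The sketch below illustrates that construction on a toy second-order system; it is a generic illustration under assumed data and lags, not necessarily the specific formulation presented in the talk.

```python
# Sketch of a GP-NARX style model: the GP still learns a static map, but from
# lagged inputs/outputs to the next output, so the composite model is dynamic.
# Toy second-order system; lags, kernel and data are illustrative.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(2)

# Simulate a stable second-order system driven by random forcing (toy data).
n = 500
u = rng.standard_normal(n)
y = np.zeros(n)
for k in range(2, n):
    y[k] = 1.8 * y[k - 1] - 0.9 * y[k - 2] + 0.1 * u[k - 1]
y += 0.01 * rng.standard_normal(n)

# Build lagged regressors: predict y[k] from (y[k-1], y[k-2], u[k-1]).
Z = np.column_stack([y[1:-1], y[:-2], u[1:-1]])
t = y[2:]

gp = GaussianProcessRegressor(RBF(length_scale=[1.0] * 3) + WhiteKernel(1e-4),
                              normalize_y=True)
gp.fit(Z[:300], t[:300])

# One-step-ahead predictions with uncertainty on held-out data.
mu, sd = gp.predict(Z[300:], return_std=True)
rmse = np.sqrt(np.mean((mu - t[300:]) ** 2))
print(f"one-step-ahead RMSE: {rmse:.4f}, mean predictive sd: {sd.mean():.4f}")
```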



Gaussian process emulation to rigorously explore uncertainty in complex models of the atmosphere and climate

Abstract

Large computer models are used to simulate our knowledge of complex systems like the atmosphere and climate, with the aim of better understanding and predicting the system’s behaviour, how it might evolve, and how it could respond to future changes. However, these models are inherently uncertain, with many uncertain inputs (parameters), and we need to understand how this uncertainty can affect our predictions. In this presentation, I’ll describe a statistical framework that utilises Gaussian Process emulators (surrogate models) alongside a variety of other statistical techniques to examine the effects of parameter uncertainties on outputs from complex models. We will see examples of the application of this approach with real models from atmospheric science. In particular, I’ll show results from a large study of the Met Office’s climate model in which we explore the effect of parameter uncertainties on predictions of how aerosols (tiny particles in the atmosphere) have affected the Earth's energy balance since pre-industrial times, known as the 'aerosol radiative forcing', and describe our efforts to use observations from ship campaigns, flight campaigns and ground stations to try to reduce this uncertainty. We’ll also consider some of the challenges in the practical application of this kind of approach with real-world models.
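
As a hedged sketch of the emulator-based uncertainty analysis, the code below trains a GP emulator on a modest number of runs of a toy stand-in model and then propagates samples from assumed parameter distributions through the emulator to obtain an output distribution. The model, distributions and sample sizes are illustrative; they are not taken from the Met Office study.

```python
# Sketch: propagate parameter uncertainty through a GP emulator trained on a
# limited set of model runs, instead of running the expensive model directly.
# The toy function stands in for a climate-model output; it is not the aerosol
# forcing calculation from the talk.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(3)

def expensive_model(theta):
    # Stand-in for an output depending on three uncertain parameters.
    return theta[..., 0] * np.exp(-theta[..., 1]) + 0.3 * np.sin(theta[..., 2])

# Train the emulator on a modest number of model runs.
X_train = rng.uniform(0, 1, size=(60, 3))
y_train = expensive_model(X_train)
emulator = GaussianProcessRegressor(Matern(length_scale=[0.3] * 3, nu=2.5),
                                    normalize_y=True).fit(X_train, y_train)

# Sample the (assumed) parameter distributions and push them through the
# emulator rather than the expensive model.
theta_samples = rng.uniform(0, 1, size=(100_000, 3))
pred = emulator.predict(theta_samples)
lo, med, hi = np.percentile(pred, [2.5, 50, 97.5])
print(f"output median {med:.3f}, central 95% range [{lo:.3f}, {hi:.3f}]")
```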



Modelling London’s Air Quality Using Spatio-Temporal Gaussian Processes

Abstract

Air pollution is one of the greatest global threats to health. In cities like London, it has become a major topic of concern, with many policies and interventions implemented to manage the problem. However, there is a strong demand for accurate data-driven models that can provide real-time information and short-term forecasts. In this talk, I describe an approach for constructing such models using Gaussian processes. In particular, I will focus on a scalable approach to Gaussian process inference that combines spatio-temporal filtering with natural gradient variational inference, resulting in a non-conjugate GP method for multivariate data that scales linearly with respect to time. This leads to an efficient and accurate method for large spatio-temporal problems that I will demonstrate on London’s air quality data.
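
To give a flavour of the linear-in-time scaling, the sketch below uses the state-space view of a purely temporal GP with an exponential (Matérn-1/2) kernel, which can be handled in O(n) with a Kalman filter rather than O(n³) with a dense kernel matrix. The method in the talk additionally handles space, multivariate outputs and non-conjugate likelihoods via natural gradient variational inference, none of which is shown here; the data and hyperparameters are invented, and only the forward (filtering) pass is given, whereas a backward smoothing pass would recover the full GP posterior.

```python
# Sketch: a GP with an exponential (Matern-1/2) kernel is equivalent to an
# Ornstein-Uhlenbeck state-space model, so regression over n time points costs
# O(n) with a Kalman filter. Toy pollutant-like series; values are illustrative.
import numpy as np

rng = np.random.default_rng(4)

# Irregularly sampled toy time series.
t = np.sort(rng.uniform(0, 10, size=2000))
y = np.sin(t) + 0.3 * rng.standard_normal(t.size)

ell, sf2, sn2 = 1.0, 1.0, 0.3 ** 2   # lengthscale, signal variance, noise variance

# Kalman filter for the scalar OU state; linear in the number of time points.
m, P = 0.0, sf2                       # prior mean and variance of the state
means, variances = np.empty(t.size), np.empty(t.size)
prev_t = t[0]
for k in range(t.size):
    a = np.exp(-(t[k] - prev_t) / ell)          # transition over the time gap
    m, P = a * m, a * a * P + sf2 * (1 - a * a) # predict step
    s = P + sn2                                 # innovation variance
    gain = P / s
    m += gain * (y[k] - m)                      # measurement update
    P *= (1 - gain)
    means[k], variances[k] = m, P
    prev_t = t[k]

print("filtered estimate at final time:",
      means[-1], "+/-", np.sqrt(variances[-1]))
```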