Room: PLB-DS05, Pam Liversidge Building


8:00-9:00 Arrivals

9:00-10:30 Introduction to Bayesian Optimization
Javier Gonzalez, Amazon Research Cambridge [slides]

10:30-11:00 Coffee Break

11:00-12:30 Sparse Gaussian Process Approximations
Richard Turner, University of Cambridge [slides]

12:30-13:45 Lunch

13:45-15:30 Lab Session 3 - Bayesian optimization with GPyOpt

15:30-16:00 Tea Break

16:00-17:00 Integration over hyperparameters and estimation of predictive performance
Aki Vehtari, Aalto University [slides]

Abstracts


Sparse Gaussian Process Approximations

The application of GPs is limited by computational and analytical intractabilities that arise when data are sufficiently numerous or when employing non-Gaussian models. A wealth of GP approximation schemes has been developed over the last 15 years to address these key limitations. Many of these schemes employ a small set of pseudo data points to summarise the actual data. This pseudo data summary transforms the dense matrix computations required by GPs into sparse matrix computations, enabling acceleration. I will review some of the most important approaches in this vein, focussing on the variational inference approach that has recently revolutionised the deployment of GP-based probabilistic models.
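The pseudo-data idea in the abstract can be sketched in a few lines of NumPy. The code below is a minimal, illustrative implementation of one such scheme (the projected-process/DTC predictive mean, not the specific variational method the talk covers): M pseudo inputs Z summarise N data points, so the dominant cost drops from the O(N^3) of an exact GP to O(N M^2). All names and the toy data are hypothetical, chosen only for this sketch.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    # Squared-exponential kernel matrix between the rows of A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sparse_gp_mean(X, y, Z, Xstar, noise=0.1):
    # Predictive mean of a sparse GP regressor (projected-process/DTC form).
    # Z holds M pseudo inputs summarising the N training points, so the
    # only linear system solved is M x M rather than N x N.
    Kuu = rbf(Z, Z) + 1e-8 * np.eye(len(Z))   # M x M (jitter for stability)
    Kuf = rbf(Z, X)                           # M x N
    Ksu = rbf(Xstar, Z)                       # test x M
    A = Kuu + Kuf @ Kuf.T / noise**2          # M x M system
    return Ksu @ np.linalg.solve(A, Kuf @ y) / noise**2

# Hypothetical toy problem: a noisy sine observed at N=50 inputs,
# summarised by M=10 evenly spaced pseudo inputs.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 6.0, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
Z = np.linspace(0.0, 6.0, 10)[:, None]
Xstar = np.linspace(0.0, 6.0, 5)[:, None]
mu = sparse_gp_mean(X, y, Z, Xstar)           # close to sin(Xstar)
```

With Z equal to the full training inputs this construction recovers the exact GP predictive mean; shrinking M trades accuracy for the sparse-matrix speed-up the abstract describes.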