Thursday 9th June

09:00 - 09:15 Welcome
09:15 - 10:15 Gaussian Processes I Have Known
  Tony O'Hagan, Department of Probability and Statistics, University of Sheffield, U.K.
   
10:15 - 11:15 Approximate Methods for GP Regression: A Survey and an Empirical Comparison
  Chris Williams, School of Informatics, University of Edinburgh
   
11:15 - 11:45 Coffee Break
11:45 - 12:45 Nonparametric Bayesian Models in Machine Learning
  Kai Yu, Siemens AG, Germany
   
12:45 - 13:45 Lunch
13:45 - 14:00 Some thoughts on Gaussian Processes
  Zoubin Ghahramani, The Gatsby Institute, University College London, U.K.
   
14:00 - 14:45 Sparse Parametric Gaussian Processes
  Ed Snelson, The Gatsby Institute, University College London, U.K.
   
14:45 - 15:45 Towards bridging the gap between transcriptome and proteome measurement uncertainties with Gaussian processes
Mahesan Niranjan, Department of Computer Science, University of Sheffield, U.K.
   
15:45 - 16:15 Coffee Break
16:15 - 17:15 Assessing Approximations for Gaussian Process Classification
  Carl Rasmussen, Max-Planck Institute, Tuebingen, Germany
  Based on joint work with Malte Kuss.

The computations required for Gaussian Process Classification are analytically intractable. Several approximation schemes have been proposed recently, but at present it is less than clear how well each of these performs. I compare the Laplace approximation to the Expectation Propagation (EP) algorithm on the evaluation of two key quantities: the marginal likelihood and the predictive probabilities. I also compare to ground truth, tortuously obtained by Annealed Importance Sampling (MCMC).
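
As a point of reference for this comparison, here is a minimal numpy sketch of the Laplace side: the standard Newton iteration for the posterior mode in binary GP classification (Rasmussen & Williams, Algorithm 3.1), with an assumed logistic likelihood and RBF kernel, returning the approximate log marginal likelihood. The EP and Annealed Importance Sampling baselines from the talk are beyond a short sketch.

    import numpy as np

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # squared-exponential covariance between two sets of inputs
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

    def laplace_gpc(X, y, lengthscale=1.0, variance=1.0, n_iter=50):
        # Newton iteration for the posterior mode of binary GP classification
        # with a logistic likelihood, y in {-1, +1}
        # (Rasmussen & Williams, Algorithm 3.1)
        n = len(y)
        K = rbf_kernel(X, X, lengthscale, variance)
        f = np.zeros(n)
        for _ in range(n_iter):
            pi = 1.0 / (1.0 + np.exp(-f))        # sigmoid(f)
            t = (y + 1) / 2.0                    # targets mapped to {0, 1}
            W = pi * (1 - pi)                    # -d^2 log p(y|f) / df^2
            sW = np.sqrt(W)
            L = np.linalg.cholesky(np.eye(n) + sW[:, None] * K * sW[None, :])
            b = W * f + (t - pi)                 # numerically stable Newton step
            a = b - sW * np.linalg.solve(L.T, np.linalg.solve(L, sW * (K @ b)))
            f = K @ a                            # new mode estimate
        # recompute W and L at the converged mode for the evidence term
        pi = 1.0 / (1.0 + np.exp(-f))
        sW = np.sqrt(pi * (1 - pi))
        L = np.linalg.cholesky(np.eye(n) + sW[:, None] * K * sW[None, :])
        log_lik = -np.sum(np.log1p(np.exp(-y * f)))
        log_Z = -0.5 * a @ f + log_lik - np.sum(np.log(np.diag(L)))
        return f, log_Z

    # toy 1-d problem
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(40, 1))
    y = np.sign(np.sin(X[:, 0]) + 0.3 * rng.standard_normal(40))
    f_hat, log_Z = laplace_gpc(X, y)
    print("approximate log marginal likelihood:", log_Z)

The log_Z this returns is the Laplace estimate of the marginal likelihood, the first of the two quantities the talk compares across approximations.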

 

Friday 10th June

09:00 - 09:30 KL Corrected Variational Inference for Gaussian Processes
  Neil Lawrence, Department of Computer Science, University of Sheffield, U.K.
   
09:30 - 10:00 Resampling PCA & GP Inference
  Manfred Opper, School of Electronics and Computer Science, University of Southampton, U.K.
   
10:00 - 11:00 Joint Gaussian Process-Density Mixtures
  Ole Winther, Technical University of Denmark, Denmark
  Gaussian Processes (GPs) provide a natural framework for Bayesian kernel methods. This talk will be about some work in progress on combining GPs with density estimation in a mixture model. The motivations are threefold: kernels tuned individually to each mixture component give a more flexible input-output model; unlabelled data can be used in a semi-supervised setting; and the computational complexity can be reduced, because only examples belonging to the same mixture component need to be included in that component's kernel matrix. A variational Bayes treatment of the joint estimation problem shows how a low-complexity solution can be obtained. A more precise approximation to the inference problem, of the expectation consistent/propagation type, is also possible.
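
The computational payoff of per-component kernel matrices is easy to see in a toy setting. The sketch below is not the variational Bayes treatment from the talk: it substitutes hard k-means assignments for the input-density mixture and then fits an independent GP regressor per component, so each kernel matrix contains only that component's points. All function and parameter names here are illustrative.

    import numpy as np

    def rbf(X1, X2, ls=1.0):
        d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
        return np.exp(-0.5 * d2 / ls ** 2)

    def kmeans(X, k, n_iter=20, seed=0):
        # plain k-means, standing in for the density mixture over inputs
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), k, replace=False)]
        for _ in range(n_iter):
            z = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
            centers = np.array([X[z == j].mean(0) if np.any(z == j)
                                else centers[j] for j in range(k)])
        return z, centers

    def fit_predict_mixture_gp(X, y, Xs, k=3, ls=1.0, noise=0.1):
        # one GP regressor per component: each kernel matrix contains only
        # that component's points; test points go to the nearest centre
        z, centers = kmeans(X, k)
        zs = ((Xs[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
        mu = np.empty(len(Xs))
        for j in range(k):
            Xj, yj = X[z == j], y[z == j]
            Kj = rbf(Xj, Xj, ls) + noise ** 2 * np.eye(len(Xj))
            alpha = np.linalg.solve(Kj, yj)      # O(n_j^3), not O(n^3)
            mu[zs == j] = rbf(Xs[zs == j], Xj, ls) @ alpha
        return mu

    # toy demo: a piecewise target handled by three local GPs
    rng = np.random.default_rng(1)
    X = rng.uniform(-4, 4, size=(90, 1))
    y = np.where(X[:, 0] < 0, np.sin(3 * X[:, 0]), 0.5 * X[:, 0])
    y = y + 0.05 * rng.standard_normal(90)
    Xs = np.linspace(-4, 4, 9)[:, None]
    print(fit_predict_mixture_gp(X, y, Xs, k=3))

With n points split evenly over k components, the cubic solves cost O(k (n/k)^3) = O(n^3 / k^2) rather than O(n^3), which is the complexity reduction the abstract refers to.
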
10:00 - 11:00 Expectation Consistent Approximate Inference
  Ole Winther, Technical University of Denmark, Denmark
   
11:00 - 11:30 Coffee Break
11:30 - 12:30 Requirements for GPC in the Real World
  Anton Schwaighofer, GMD First
   
12:30 - 14:00 Lunch
14:00 - 15:00 Sparsity in Gaussian Processes: Questions
  Lehel Csato, Max-Planck Institute, Tuebingen, Germany
   
15:00 - 16:00 Issues and Challenges in On-Line Gaussian Process Estimation
  Tony Dodd, Automatic Control and Systems Engineering, University of Sheffield, U.K.
Gaussian processes (GPs) have been successfully applied to a number of well-known problems in machine learning, signal processing and function approximation. Much of the interest in GPs arises from the multiple interpretations possible: statistical, (Bayesian) probabilistic and reproducing kernel Hilbert spaces (RKHS). Previous research has focused on batch learning for GPs, where it is assumed that all the data are available in advance. Recently there has been interest in on-line, or sequential, learning of GPs. This has applications to incremental solutions for large data set problems, on-line learning and adaptive non-stationary modelling. Issues and challenges relating to the use of on-line Gaussian process models in machine learning will be presented, including the need for provable convergence guarantees, convergence rates, and efficient computation of the models. Preliminary results on some of these will be discussed.
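
To make the efficiency question concrete, below is a minimal numpy/scipy sketch of one sequential scheme: exact GP regression in which each arriving observation extends the Cholesky factor of the kernel matrix by one row, at O(n^2) cost per update instead of an O(n^3) refit. The kernel (unit signal variance), noise level and class name are illustrative assumptions; sparse on-line schemes such as Csató and Opper's additionally bound the basis-set size, which this sketch does not attempt.

    import numpy as np
    from scipy.linalg import solve_triangular

    def k_rbf(x1, x2, ls=1.0):
        # squared-exponential covariance for 1-d inputs, unit variance
        return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ls ** 2)

    class OnlineGP:
        # exact GP regression updated one observation at a time: each new
        # point appends a row to the Cholesky factor of K + noise^2 I,
        # costing O(n^2) instead of an O(n^3) refit
        def __init__(self, ls=1.0, noise=0.1):
            self.ls, self.noise = ls, noise
            self.X = np.empty(0)
            self.y = np.empty(0)
            self.L = np.empty((0, 0))

        def add(self, x, y):
            if len(self.X) == 0:
                # k(x, x) = 1 for the unit-variance RBF kernel
                self.L = np.array([[np.sqrt(1.0 + self.noise ** 2)]])
            else:
                k = k_rbf(self.X, np.array([x]), self.ls)[:, 0]
                l = solve_triangular(self.L, k, lower=True)
                l_star = np.sqrt(1.0 + self.noise ** 2 - l @ l)
                n = len(self.X)
                Lnew = np.zeros((n + 1, n + 1))
                Lnew[:n, :n], Lnew[n, :n], Lnew[n, n] = self.L, l, l_star
                self.L = Lnew
            self.X = np.append(self.X, x)
            self.y = np.append(self.y, y)

        def predict(self, xs):
            # posterior mean at new inputs xs
            alpha = solve_triangular(
                self.L.T, solve_triangular(self.L, self.y, lower=True))
            return k_rbf(self.X, xs, self.ls).T @ alpha

    # stream observations in one at a time
    rng = np.random.default_rng(2)
    gp = OnlineGP()
    for x in np.linspace(-3, 3, 25):
        gp.add(x, np.sin(x) + 0.1 * rng.standard_normal())
    print(gp.predict(np.array([0.0, 1.0])))
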
16:00 - 16:30 Tea
16:30 - 17:30 Some concerns about computationally efficient approximations to GPs
  Joaquin Quinonero Candela, Max-Planck Institute, Tuebingen, Germany