Thursday 9th June
11:15 - 11:45  Coffee Break

13:45 - 14:00  Some thoughts on Gaussian Processes
Zoubin Ghahramani, The Gatsby Institute, University College London, U.K.
14:45 - 15:45  Towards bridging the gap between transcriptome and proteome measurement uncertainties with Gaussian processes
Niranjan, Department of Computer Science, University of Sheffield, U.K.
15:45 - 16:15  Coffee Break

16:15 - 17:15  Assessing Approximations for Gaussian Process Classification
Carl Rasmussen, Max-Planck Institute, Tuebingen, Germany
Based on joint work with Malte Kuss. The computations required for Gaussian process classification are analytically intractable. Several approximation schemes have been proposed recently, but at present it remains unclear how well each of these performs. I compare the Laplace approximation to the Expectation Propagation (EP) algorithm on the evaluation of two key quantities: the marginal likelihood and the predictive probabilities. I also compare to ground truth, tortuously obtained by Annealed Importance Sampling (MCMC).
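The Laplace approximation mentioned in the abstract can be sketched in a few lines of numpy: Newton iterations find the mode of the posterior over latent function values, and a Gaussian fitted at the mode yields an approximate log marginal likelihood. This is a minimal illustrative sketch, assuming a squared-exponential kernel and logistic likelihood; the kernel choice and hyperparameters are not from the talk.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance matrix (illustrative choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def laplace_gp_classifier(X, y, lengthscale=1.0, variance=1.0, n_iter=50):
    """Laplace approximation for GP binary classification (logistic link).

    y must be in {-1, +1}.  Returns the posterior mode f_hat and the
    Laplace approximation to the log marginal likelihood log p(y|X).
    """
    n = len(y)
    K = rbf_kernel(X, X, lengthscale, variance)
    t = (y + 1) / 2.0                         # targets recoded to {0, 1}
    f = np.zeros(n)
    for _ in range(n_iter):                   # Newton iterations for the mode
        pi = 1.0 / (1.0 + np.exp(-f))         # logistic predictions
        W = pi * (1.0 - pi)                   # negative log-lik Hessian (diag)
        sW = np.sqrt(W)
        B = np.eye(n) + sW[:, None] * K * sW[None, :]
        L = np.linalg.cholesky(B)
        b = W * f + (t - pi)
        v = np.linalg.solve(L, sW * (K @ b))
        a = b - sW * np.linalg.solve(L.T, v)
        f = K @ a                             # updated mode estimate
    log_lik = np.sum(np.log(1.0 / (1.0 + np.exp(-y * f))))
    # Approximate log marginal likelihood at the converged mode.
    log_Z = -0.5 * a @ f + log_lik - np.sum(np.log(np.diag(L)))
    return f, log_Z
```

The stabilised form (working with W^(1/2) and the Cholesky factor of B) keeps the linear algebra well-conditioned even when some W entries are near zero.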
Friday 10th June
10:00 - 11:00  Joint Gaussian Process-Density Mixtures
Ole Winther, Technical University of Denmark, Denmark
Gaussian processes (GPs) provide a natural framework for Bayesian kernel methods. This talk describes work in progress on combining GPs with density estimation in a mixture model. The motivations are threefold: kernels tuned individually to each mixture component give a more flexible input-output model; unlabelled data can be used in a semi-supervised setting; and the computational complexity can be reduced, because only the examples belonging to a given mixture component need to be included in that component's kernel matrix. A variational Bayes treatment of the joint estimation problem shows how a low-complexity solution can be obtained. A more precise approximation to the inference problem, of the expectation consistent/propagation type, is also possible.
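The computational saving described above can be illustrated with a toy version of the idea: hard-assign each input to a mixture component and fit an independent GP regressor per component, so each kernel matrix involves only that component's examples. This sketch uses hard assignments to fixed centres rather than the variational treatment in the talk; all function names and hyperparameters are illustrative.

```python
import numpy as np

def sq_exp(X1, X2, ell=0.5):
    """Squared-exponential kernel (illustrative choice)."""
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell**2)

def gp_fit_predict(X, y, Xs, ell=0.5, noise=1e-2):
    """Standard GP regression mean prediction on one component's data."""
    K = sq_exp(X, X, ell) + noise * np.eye(len(X))
    alpha = np.linalg.solve(K, y)
    return sq_exp(Xs, X, ell) @ alpha

def mixture_gp_predict(X, y, Xs, centers):
    """Toy mixture of GPs with hard assignments: each point goes to its
    nearest 1-D centre, and each component inverts a kernel matrix built
    from only its own examples rather than one global n-by-n matrix."""
    assign = np.argmin(np.abs(X[:, None, 0] - centers[None, :]), axis=1)
    assign_s = np.argmin(np.abs(Xs[:, None, 0] - centers[None, :]), axis=1)
    preds = np.zeros(len(Xs))
    for k in range(len(centers)):
        tr, te = assign == k, assign_s == k
        if tr.any() and te.any():
            preds[te] = gp_fit_predict(X[tr], y[tr], Xs[te])
    return preds
```

With m roughly equal components the per-component kernel solves cost O((n/m)^3) each, versus O(n^3) for a single global GP; the talk's soft, variational version replaces the hard assignments with learned responsibilities.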
10:00 - 11:00  Expectation Consistent Approximate Inference
Ole Winther, Technical University of Denmark, Denmark
11:00 - 11:30  Coffee Break

15:00 - 16:00  Issues and Challenges in On-Line Gaussian Process Estimation
Tony Dodd, Automatic Control and Systems Engineering, University of Sheffield, U.K.
Gaussian processes (GPs) have been successfully applied to a number of well-known problems in machine learning, signal processing and function approximation. Much of the interest in GPs arises from the multiple interpretations possible: statistical, (Bayesian) probabilistic, and in terms of reproducing kernel Hilbert spaces (RKHS). Previous research has focused on batch learning for GPs, where it is assumed that all of the data are available in advance. Recently there has been interest in on-line or sequential learning of GPs, with applications to incremental solutions for large data sets, on-line learning and adaptive non-stationary modelling. Issues and challenges relating to the use of on-line Gaussian process models in machine learning will be presented, including the need to provably guarantee convergence, convergence rates, and how to compute the models efficiently. Preliminary results on some of these will be discussed.
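One standard building block for the efficient sequential computation mentioned above is to grow the Cholesky factor of the kernel matrix by one row per new example, an O(n^2) update instead of an O(n^3) refactorisation from scratch. The sketch below shows this for plain GP regression; it is an illustrative baseline, not the speaker's method, and the kernel and noise level are assumptions.

```python
import numpy as np

def k_se(x1, x2, ell=1.0):
    """1-D squared-exponential kernel (illustrative choice)."""
    return np.exp(-0.5 * (x1[:, None] - x2[None, :]) ** 2 / ell**2)

class OnlineGP:
    """Sequential GP regression: each new (x, y) pair extends the
    Cholesky factor of K + noise*I by one row, costing O(n^2) per
    update rather than O(n^3) for refactorising from scratch."""

    def __init__(self, ell=1.0, noise=1e-2):
        self.ell, self.noise = ell, noise
        self.X = np.empty(0)
        self.y = np.empty(0)
        self.L = np.empty((0, 0))

    def update(self, x, y):
        k = k_se(self.X, np.array([x]), self.ell)[:, 0]   # cross-covariances
        kappa = 1.0 + self.noise                          # k_se(x, x) = 1
        n = len(self.X)
        if n == 0:
            self.L = np.array([[np.sqrt(kappa)]])
        else:
            l = np.linalg.solve(self.L, k)                # triangular solve, O(n^2)
            lnn = np.sqrt(kappa - l @ l)                  # new diagonal entry
            Lnew = np.zeros((n + 1, n + 1))
            Lnew[:n, :n] = self.L
            Lnew[n, :n] = l
            Lnew[n, n] = lnn
            self.L = Lnew
        self.X = np.append(self.X, x)
        self.y = np.append(self.y, y)

    def predict(self, xs):
        """Posterior mean at test inputs, via two triangular solves."""
        ks = k_se(np.asarray(xs), self.X, self.ell)
        alpha = np.linalg.solve(self.L.T, np.linalg.solve(self.L, self.y))
        return ks @ alpha
```

After n updates the factor is identical (up to round-off) to a batch Cholesky of the full K + noise*I, so the sequential and batch posterior means agree; sparse on-line GP schemes go further by also bounding n itself.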