Workshop Schedule
Monday 12 June
09:00 - 09:30 | Coach pick-up from the Hilton, Campanile and Premier Travel Inn (Caldecott Arms). Be ready at 9:00. |
09:30 - 09:45 | Welcome |
Neil Lawrence |
09:45 - 10:45 | Gaussian Process Basics [slides] |
David MacKay, Department of Physics, University of Cambridge, U.K. | |
"How on earth can a plain old Gaussian distribution be useful for sophisticated regression and machine learning tasks?" |
10:45 - 11:15 | Coffee Break |
11:15 - 12:15 | Interpreting Covariance Functions & Classification [slides] |
Carl Rasmussen, Max-Planck Institute, Tuebingen, Germany |
12:15 - 12:35 | Spotlights of Poster Presentations |
How to choose the covariance for Gaussian process regression independently of the basis [slides], Matthias O. Franz and Peter V. Gehler, MPI Tuebingen, Germany
An Exchanging-based Refinement to Sparse Gaussian Process Regression
Learning RoboCup-Keepaway with Kernels
Sparse Log Gaussian Processes via MCMC for Spatial Epidemiology
|
12:35 - 13:30 | Lunch Break |
13:30 - 14:30 | Eigenfunctions & Approximation Methods [slides] |
Chris Williams, School of Informatics, University of Edinburgh, U.K. |
14:30 - 15:00 | Flexible and efficient Gaussian process models [slides] |
Edward Snelson, Gatsby Computational Neuroscience Unit, University College London, U.K. | |
I will briefly describe our work on the sparse pseudo-input Gaussian
process (SPGP), where we refine the sparse approximation by selecting
'pseudo-inputs' using gradient methods. I will then describe several
extensions to this framework. Firstly, we incorporate supervised
dimensionality reduction to deal with high-dimensional input
spaces. Secondly, we develop a version of the SPGP that can handle
input-dependent noise. These extensions allow GP methods to be applied to
a wider variety of modelling tasks than previously possible.
Joint work with Zoubin Ghahramani. |
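To give a flavour of what a pseudo-input approximation buys, here is a simplified sparse-GP sketch: M pseudo-inputs summarise N training points so that prediction costs O(NM^2) rather than O(N^3). This is a basic projected-process (DTC-style) predictive mean with fixed pseudo-input locations, not the SPGP itself, which also corrects the predictive variances and optimises the pseudo-inputs by gradient ascent on the marginal likelihood; kernel, data and noise level are assumptions.

```python
import numpy as np

def kern(a, b, ell=1.0):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, 500)                    # N = 500 training inputs
y = np.sin(x) + 0.1 * rng.standard_normal(500)
z = np.linspace(-5, 5, 15)                     # M = 15 pseudo-inputs (fixed here)
noise = 0.1

Kmm = kern(z, z) + 1e-6 * np.eye(len(z))
Kmn = kern(z, x)
Sigma = noise ** 2 * Kmm + Kmn @ Kmn.T         # only an M x M system to solve

x_test = np.linspace(-5, 5, 100)
mean = kern(x_test, z) @ np.linalg.solve(Sigma, Kmn @ y)
print(mean[:5])
```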
15:00 - 15:30 | Analysing Gene Expression Data Using Gaussian Processes [slides] |
Lorenz Wernisch, School of Crystallography, Birkbeck College, U.K. | |
Complex gene regulatory mechanisms ensure the proper functioning of biological cells. New high-throughput experimental techniques, such as microarrays, provide a snapshot of gene expression levels of thousands of genes at the same time. If repeated on a sample of synchronized cells, time-series profiles of gene activity can be obtained. The aim is to reconstruct the complex gene regulatory network underlying these profiles. Genes often influence each other in a nonlinear fashion and with intricate interaction patterns. Linear models are often unsuited to capturing such relationships. Gaussian processes, on the other hand, are ideal for representing nonlinear relationships. A particular attraction is the automatic relevance determination effect, removing unused inputs and resulting in sparse gene networks. |
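The automatic relevance determination (ARD) effect mentioned above can be sketched in a few lines: each input dimension gets its own length-scale, and maximising the marginal likelihood drives the length-scales of irrelevant inputs towards large values, effectively pruning them from the model. The synthetic "regulator" inputs, kernel and noise level below are assumptions, not the talk's data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
X = rng.standard_normal((40, 3))                      # 3 candidate regulator inputs
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)   # only input 0 is relevant
noise = 0.1

def ard_kernel(X1, X2, log_ell):
    ell = np.exp(log_ell)                             # one length-scale per input
    d = (X1[:, None, :] - X2[None, :, :]) / ell
    return np.exp(-0.5 * np.sum(d ** 2, axis=-1))

def neg_log_marginal_likelihood(log_ell):
    K = ard_kernel(X, X, log_ell) + noise ** 2 * np.eye(len(X))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return 0.5 * y @ alpha + np.sum(np.log(np.diag(L)))

res = minimize(neg_log_marginal_likelihood, np.zeros(3), method="L-BFGS-B")
print("learned length-scales:", np.exp(res.x))        # irrelevant inputs -> large
```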
15:30 - 16:00 | Tea Break |
16:00 - 16:30 | Gaussian Process Model for Inferring the Regulatory Activity of Transcription Factor Proteins [slides] |
Guido Sanguinetti, Department of Computer Science, University of Sheffield, U.K. | |
Inferring the concentration of transcription factor proteins from
the expression levels of target genes is a very active area of
research in computational biology. Usually, the dynamics of the gene
expression levels are modelled using differential equations where the
transcription factor protein concentrations are treated as parameters,
subsequently estimated using MCMC. We show how this inference problem
can be solved more elegantly by placing a GP prior over the latent
functions, obtaining comparable results to the standard MCMC approach
in a fraction of the time.
Joint work with Neil Lawrence and Magnus Rattray. |
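The key observation in the abstract is that expression driven by a linear ODE is a linear functional of the latent transcription-factor activity, so a GP prior on the activity makes everything jointly Gaussian. The talk treats this analytically; the sketch below only illustrates the principle numerically, with an assumed squared-exponential kernel, illustrative decay and sensitivity parameters, a discretised solution operator, and placeholder measurements (basal transcription and the initial condition are dropped for brevity).

```python
import numpy as np

t = np.linspace(0, 10, 200)
dt = t[1] - t[0]
ell, D, S, noise = 1.0, 0.5, 1.0, 0.05        # length-scale, decay, sensitivity, obs noise

Kff = np.exp(-0.5 * ((t[:, None] - t[None, :]) / ell) ** 2)   # prior over activity f(t)

# Discretised ODE solution: x(t_i) ~ sum_j S * exp(-D (t_i - t_j)) f(t_j) dt for t_j <= t_i.
A = np.where(t[:, None] >= t[None, :],
             S * np.exp(-D * (t[:, None] - t[None, :])) * dt, 0.0)

obs = np.arange(0, 200, 20)                    # indices of observed expression levels
Kxx = (A @ Kff @ A.T)[np.ix_(obs, obs)] + noise ** 2 * np.eye(len(obs))
Kfx = (Kff @ A.T)[:, obs]

x_obs = np.sin(t[obs])                         # placeholder measurements, not real data
f_mean = Kfx @ np.linalg.solve(Kxx, x_obs)     # closed-form posterior mean of f(t)
print(f_mean[:5])
```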
16:30 - 16:45 | Spotlights of Poster Presentations |
Gaussian Processes for Principal Component Analysis [slides], Colin Fyfe, University of Paisley, U.K.
Gaussian Processes for Prediction in Intensive Care
Gaussian Processes for Active Sensor Management
|
16:45 - 17:15 | Discussion Time [slides] |
Discussion Panel |
17:30 - 19:00 | Tour of Bletchley Park |
21:30 - 21:45 | Coach back to hotels. |
Tuesday 13 June
09:00 - 09:30 | Coach pick-up from the Hilton, Campanile and Premier Travel Inn (Caldecott Arms). Be ready at 9:00. |
09:30 - 10:30 | Learning Human Pose and Motion Models for Animation [slides] |
Aaron Hertzmann, Department of Computer Science, University of Toronto, Canada | |
Computer animation is an extraordinarily labor-intensive process;
obtaining high-quality motion models could make the process faster and
easier. I will describe methods for learning models of human poses
and motion from motion capture data. I will begin with a pose model
based on the Gaussian Process Latent Variable Model (GPLVM), and the
application of this model to Inverse Kinematics posing. I will then
describe the Gaussian Process Dynamical Model (GPDM) for modeling
motion dynamics. I may also mention a few other extensions to the
GPLVM for modeling motion data. I will discuss the properties of
these models (both good and bad) and potential directions for future
work.
Joint work with David Fleet, Keith Grochow, Steven L. Martin, Zoran Popovic, Jack Wang |
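The GPLVM at the heart of this talk can be summarised compactly: latent coordinates X are optimised so that a GP mapping from X explains the observed data Y. The sketch below is a bare-bones version with random stand-in data; a real pose model would also optimise kernel hyperparameters and, as in the GPDM, add a dynamics prior over the latent trajectory.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
N, D, Q = 30, 5, 2                       # data points, observed dims, latent dims
Y = rng.standard_normal((N, D))          # stand-in for mocap feature vectors
Y = Y - Y.mean(0)

def kernel(X):
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2) + 1e-3 * np.eye(len(X))

def neg_log_likelihood(x_flat):
    # -log p(Y | X) for independent GPs over each output dimension (up to a constant).
    X = x_flat.reshape(N, Q)
    L = np.linalg.cholesky(kernel(X))
    Kinv_Y = np.linalg.solve(L.T, np.linalg.solve(L, Y))
    return D * np.sum(np.log(np.diag(L))) + 0.5 * np.sum(Y * Kinv_Y)

res = minimize(neg_log_likelihood, 0.1 * rng.standard_normal(N * Q), method="L-BFGS-B")
X_latent = res.x.reshape(N, Q)           # learned low-dimensional embedding
print(X_latent[:3])
```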
10:30 - 11:00 | Gaussian Processes for Monocular 3D People tracking [slides] |
Raquel Urtasun, Computer Vision Laboratory, EPFL, Switzerland | |
We advocate the use of Gaussian Processes (GPs) to learn prior models of human
pose and motion for 3D people tracking. The Gaussian Process Latent Variable
Model (GPLVM) provides a low-dimensional embedding of the human pose, and
defines a density function that gives higher probability to poses close to the
training data. The Gaussian Process Dynamical Model (GPDM) also provides a
complex dynamical model in terms of another GP. With the use of Bayesian model
averaging both GPLVM and GPDM can be learned from relatively small amounts of
training data, and they generalize gracefully to motions outside the training
set. We show that such priors are effective for tracking a range of human
walking styles, despite weak and noisy image measurements and a very simple
image likelihood. Tracking is formulated in terms of a MAP estimator on short
sequences of poses within a sliding temporal window.
Joint work with Jack Wang, David Fleet, Aaron Hertzmann and Pascal Fua |
11:00 - 11:30 | Coffee Break |
11:30 - 12:00 | Gaussian Process Implicit Surfaces [slides] |
Oliver Williams, Microsoft Research, Cambridge, U.K. | |
Many applications in computer vision and computer graphics require the
definition of curves and surfaces. Implicit surfaces are a popular
choice for this because they are smooth, can be appropriately
constrained by known geometry, and require no special treatment for
topology changes. In this paper we use Gaussian processes for this and
derive a covariance function equivalent to the thin plate spline
regularizer which has desirable properties for shape modelling. We
demonstrate our approach for both 2D curves and 3D surfaces. The benefit
of using a Gaussian process for this is the meaningful probabilistic
representation of the function.
Joint work with Andrew Fitzgibbon. |
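The implicit-surface idea can be sketched as ordinary GP regression: fit a function from points labelled 0 on the curve and +/-1 off it, then take the zero level set of the posterior mean as the shape. The talk derives a covariance equivalent to the thin-plate spline regularizer; the sketch below substitutes a squared-exponential kernel as a simple stand-in, and the circle data are synthetic.

```python
import numpy as np

def kern(A, B, ell=0.5):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ell ** 2)

# 2D example: points on a unit circle (target 0), one interior point (-1)
# and a ring of exterior points (+1) to fix the sign of the implicit function.
theta = np.linspace(0, 2 * np.pi, 20, endpoint=False)
on_curve = np.c_[np.cos(theta), np.sin(theta)]
X = np.vstack([on_curve, [[0.0, 0.0]], 2.0 * on_curve])
y = np.concatenate([np.zeros(20), [-1.0], np.ones(20)])

K = kern(X, X) + 1e-4 * np.eye(len(X))
alpha = np.linalg.solve(K, y)

# Evaluate the posterior mean on a grid; the curve is the zero level set {f = 0}.
g = np.linspace(-2, 2, 50)
grid = np.array([[u, v] for u in g for v in g])
f = kern(grid, X) @ alpha
print("fraction of grid inside the shape:", np.mean(f < 0))
```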
12:00 - 12:30 | Minimum Likelihood Image Feature and Scale Detection Based on the Brownian Image Model [slides] |
Kim S. Pedersen, The Image Group, IT University of Copenhagen, Denmark | |
We present a novel approach to image feature and scale detection based
on the fractional Brownian image model in which images are
realisations of a Gaussian random process on the plane. Image features
are points of interest usually sparsely distributed in images. We
propose to detect such points and their intrinsic scale by finding
points in scale-space that locally minimise the likelihood under the
model.
Joint work with Peter van Dorst and Marco Loog. |
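The detection criterion can be illustrated roughly as follows: local pixel increments around each position are scored by their Gaussian log-likelihood under a fractional Brownian increment covariance, and positions where that likelihood is locally smallest are flagged as features. The Hurst exponent, patch size and single smoothing scale below are assumptions made for the sketch; the talk searches over a full scale-space.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, minimum_filter

rng = np.random.default_rng(3)
image = gaussian_filter(rng.standard_normal((64, 64)), 2.0)     # toy image, one "scale"

H, sigma2 = 0.5, 1.0
offsets = [(dy, dx) for dy in (-2, -1, 0, 1, 2) for dx in (-2, -1, 0, 1, 2)
           if (dy, dx) != (0, 0)]
P = np.array(offsets, float)

# Covariance of fBm increments relative to the patch centre.
r = np.linalg.norm(P, axis=1)
rij = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
C = 0.5 * sigma2 * (r[:, None] ** (2 * H) + r[None, :] ** (2 * H) - rij ** (2 * H))
C_inv = np.linalg.inv(C + 1e-8 * np.eye(len(P)))

loglik = np.full(image.shape, np.inf)
for y in range(2, 62):
    for x in range(2, 62):
        d = np.array([image[y + dy, x + dx] - image[y, x] for dy, dx in offsets])
        loglik[y, x] = -0.5 * d @ C_inv @ d      # up to a constant shared by all patches

# Features: positions where the likelihood is a local minimum.
is_feature = (loglik == minimum_filter(loglik, size=5)) & np.isfinite(loglik)
print("detected features:", int(np.sum(is_feature)))
```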
12:30 - 13:30 | Lunch Break |
13:30 - 14:00 | Demonstration of the Colossus Mark II Computer |
Tony Sale, Bletchley Park |
14:00 - 14:30 | Wifi Localization with Gaussian Processes [slides] |
Brian Ferris, Department of Computer Science and Engineering, University of Washington, U.S.A. | |
Estimating the location of a mobile device from wireless signal strength is an interesting research problem, especially given the complexity of signal propagation through space in the presence of obstacles such as buildings, walls, or people. Gaussian processes have already been used to solve such signal strength localization problems. We extend this work to indoor WiFi localization and present novel kernel functions which increase the accuracy of the Gaussian process model, especially when faced with sparse training data. We additionally present preliminary results of simultaneous mapping and localization using Gaussian process latent variable modeling.
Joint work with Dieter Fox. |
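The basic recipe described above can be sketched simply: one GP per access point maps 2D position to expected signal strength, and a new strength reading is localised by scoring candidate positions against the GP predictions. The synthetic survey data, path-loss model, kernel and noise level below are assumptions; the talk's contributions are better kernel functions and a GPLVM extension for mapping without a ground-truth survey.

```python
import numpy as np

def kern(A, B, ell=3.0):
    d2 = np.sum((A[:, None, :] - B[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / ell ** 2)

rng = np.random.default_rng(4)
positions = rng.uniform(0, 20, (60, 2))                    # survey positions (metres)
aps = np.array([[2.0, 2.0], [18.0, 5.0], [10.0, 18.0]])    # access point locations
def rssi(p):                                               # toy path-loss model
    return -40.0 - 20.0 * np.log10(1.0 + np.linalg.norm(p - aps, axis=1))
signals = np.array([rssi(p) for p in positions]) + rng.standard_normal((60, 3))

noise = 1.0
K = kern(positions, positions) + noise ** 2 * np.eye(60)
alpha = np.linalg.solve(K, signals - signals.mean(0))      # one column per access point

# Localise a new reading: best squared prediction error over a grid of candidates
# (equivalent to maximum likelihood with a shared Gaussian noise level).
true_pos = np.array([12.0, 7.0])
reading = rssi(true_pos) + rng.standard_normal(3)
g = np.linspace(0, 20, 41)
candidates = np.array([[x, y] for x in g for y in g])
pred = kern(candidates, positions) @ alpha + signals.mean(0)
best = candidates[np.argmin(np.sum((pred - reading) ** 2, axis=1))]
print("true:", true_pos, "estimated:", best)
```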
14:30 - 15:00 | Learning to Control an Octopus Arm with Gaussian Process Temporal Difference Methods [slides] |
Yaakov Engel, Department of Computing Science, University of Alberta, Canada | |
The Octopus arm is a highly versatile and complex limb. How the
Octopus controls such a hyper-redundant arm (not to mention eight of
them!) is as yet unknown. Robotic arms based on the same mechanical
principles may render present-day robotic arms obsolete. In this talk,
I will describe how we tackle this problem using an online
reinforcement learning algorithm, based on a Bayesian approach to
policy evaluation known as Gaussian process temporal difference (GPTD)
learning.
Our substitute for the real arm is a computer simulation of a 2-dimensional model of an Octopus arm. Even with the simplifications inherent to this model, the state space we face is a high-dimensional one, for any arm of reasonable size. We apply a GPTD-based algorithm to this domain, and demonstrate its operation on several learning tasks of varying degrees of difficulty. Joint work with Peter Szabo and Dmitry Volkinshtein. |
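For readers unfamiliar with GPTD, the generative model can be sketched in a few lines: rewards are treated as noisy observations of temporal differences of the value function, r_t = V(x_t) - gamma V(x_{t+1}) + noise, with a GP prior on V, so policy evaluation becomes GP regression. The batch version below is not the talk's online algorithm, and the 1D random-walk "states", kernel and noise level are assumptions.

```python
import numpy as np

def kern(a, b, ell=0.5):
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

rng = np.random.default_rng(5)
T, gamma, noise = 50, 0.95, 0.1
states = np.cumsum(rng.standard_normal(T + 1)) * 0.1        # trajectory x_0 .. x_T
rewards = -np.abs(states[:-1]) + 0.05 * rng.standard_normal(T)

# r = H v + noise, where v = (V(x_0), ..., V(x_T)).
H = np.zeros((T, T + 1))
H[np.arange(T), np.arange(T)] = 1.0
H[np.arange(T), np.arange(T) + 1] = -gamma

K = kern(states, states)
G = H @ K @ H.T + noise ** 2 * np.eye(T)

x_test = np.linspace(states.min(), states.max(), 100)
V_mean = kern(x_test, states) @ H.T @ np.linalg.solve(G, rewards)   # posterior value estimate
print(V_mean[:5])
```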
15:00 - 15:30 | Gaussian Process Approximations of Stochastic Differential Equations [slides] |
Cedric Archambeau, School of Electronics and Computer Science, University of Southampton, U.K. | |
It is well known that certain classes of Gaussian process arise naturally as solutions to stochastic differential equations, for example the Ornstein-Uhlenbeck process arises as the stationary solution of a simple linear stochastic differential equation. In this work we introduce some initial results on the approximation of the solution of general stochastic differential equations by Gaussian processes. We employ a variational framework, where we seek a Gaussian process approximation to the posterior distribution of the state of a system whose dynamics are governed by a stochastic differential equation. The application for this work is approximate inference within stochastic dynamic models, in particular models used in weather forecasting.
Joint work with Dan Cornford, Manfred Opper and John Shawe-Taylor |
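The Ornstein-Uhlenbeck example from the abstract is easy to check numerically: the linear SDE dx = -theta x dt + sigma dW has a stationary solution that is a GP with covariance k(tau) = sigma^2 / (2 theta) exp(-theta |tau|). The sketch below only verifies that fact by simulation; the talk itself develops a variational GP approximation for general, nonlinear SDEs.

```python
import numpy as np

rng = np.random.default_rng(6)
theta, sigma, dt, n_paths = 1.0, 0.5, 0.01, 20000
lags = [0, 50, 100, 200]                                   # lags in time steps

# Euler-Maruyama, started from the stationary distribution N(0, sigma^2 / (2 theta)).
x = rng.standard_normal(n_paths) * sigma / np.sqrt(2 * theta)
x0 = x.copy()
snapshots = {0: x0}
for step in range(1, max(lags) + 1):
    x = x + (-theta * x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_paths)
    if step in lags:
        snapshots[step] = x.copy()

for lag in lags:
    empirical = np.mean(x0 * snapshots[lag])
    analytic = sigma ** 2 / (2 * theta) * np.exp(-theta * lag * dt)
    print(f"lag {lag * dt:.1f}: empirical {empirical:.4f}, analytic {analytic:.4f}")
```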
15:30 - 16:00 | Tea Break |
16:00 - 17:00 | Discussion Time [slides] |
Discussion Panel |
17:30 - 17:45 | Coach back to hotels and to Milton Keynes and Bletchley Railway Stations. |