Room: Zoom


8:30-9:00 Arrivals

9:00-10:30 Scalability of Gaussian Processes (Zhenwen Dai, Spotify) [slides]

10:30-11:00 Coffee Break

11:00-12:30 GPs for non-Gaussian likelihoods (ST John, Finnish Center for Artificial Intelligence / Aalto University) [slides]

12:30-14:00 Lunch

14:00-15:30 Lab session 2 and round tables with the speakers (Dr. Zhenwen Dai and Dr. ST John)

15:30-16:00 Coffee Break

16:00-17:00 Multioutput Gaussian Processes (Felipe Tobar, Universidad de Chile) [slides]

Abstracts


Multioutput Gaussian Processes

This talk will focus on the multioutput extension of GPs, also known as multitask GPs or vector-valued GPs. Akin to their scalar-valued counterparts, MOGPs are Bayesian nonparametric generative models for time series which, in addition to modelling temporal dependencies within the data, also account for across-channel relationships. The main challenge in MOGPs is therefore the construction of covariance functions that are as expressive as possible in capturing relationships among different time series while satisfying the structural properties (e.g., positive definiteness) of the full multioutput covariance. We will start with a motivation for MOGPs and show how they can be constructed by mixing independent GPs; then we will review standard approaches to covariance design and their implications. Lastly, we will present dedicated software for MOGPs with examples and real-world applications.
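The construction by mixing independent GPs mentioned above can be illustrated with a short sketch. The snippet below is a minimal NumPy implementation of one standard approach of this kind, the linear model of coregionalization (LMC): Q independent latent GPs are linearly mixed into P output channels, and the resulting multioutput covariance is positive semidefinite by construction. The function names, the choice of an RBF kernel for the latent processes, and the mixing matrix values are illustrative assumptions, not the speaker's specific formulation.

```python
import numpy as np

def rbf(x1, x2, lengthscale=1.0):
    # Squared-exponential (RBF) kernel between two 1-D input sets.
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def lmc_covariance(x, W, lengthscales):
    """Linear model of coregionalization.

    Mixes Q independent latent GPs (latent GP q has an RBF kernel with
    lengthscales[q]) into P outputs via the (P, Q) mixing matrix W.
    Returns the full (P*N, P*N) multioutput covariance: a sum of
    Kronecker products of rank-1 coregionalization matrices with the
    latent kernels, hence positive semidefinite by construction.
    """
    P, Q = W.shape
    N = len(x)
    K = np.zeros((P * N, P * N))
    for q in range(Q):
        Kq = rbf(x, x, lengthscales[q])   # (N, N) latent covariance
        Bq = np.outer(W[:, q], W[:, q])   # (P, P) rank-1 coregionalization
        K += np.kron(Bq, Kq)
    return K

# Two output channels driven by two latent GPs on five inputs.
x = np.linspace(0.0, 1.0, 5)
W = np.array([[1.0, 0.5],
              [0.3, 1.2]])
K = lmc_covariance(x, W, lengthscales=[0.2, 1.0])
```

The off-diagonal (N, N) blocks of K encode the across-channel correlations that a bank of independent single-output GPs would have to ignore.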