Variational Model Selection for Sparse Gaussian Process Regression

author: Michalis K. Titsias, School of Mathematics, University of Manchester
published: Oct. 9, 2008, recorded: September 2008
Description

Model selection for sparse Gaussian process (GP) models is an important problem that involves selecting both the inducing/active variables and the kernel parameters. We describe an auxiliary variational method for sparse GP regression that jointly learns the inducing variables and kernel parameters by minimizing the Kullback-Leibler divergence between an approximate distribution and the true posterior over the latent function values. The variational distribution is parametrized by an unconstrained distribution over the inducing variables and a conditional GP prior. This framework allows us to compute a lower bound on the true log marginal likelihood that can be reliably maximized over both the inducing inputs and the kernel parameters. We also show how several of the most advanced sparse GP methods, such as the subset of data (SD), DTC, FITC, and PITC methods, can be reformulated within this framework.
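
As a rough illustration of what evaluating such a bound looks like in practice, the numpy sketch below computes the collapsed form F = log N(y | 0, Q_nn + sigma^2 I) - tr(K_nn - Q_nn) / (2 sigma^2), where Q_nn = K_nm K_mm^{-1} K_mn is the Nystrom approximation induced by the inducing inputs Z. The RBF kernel choice, the function names, and the initialization of Z at a subset of the data are illustrative assumptions, not details taken from the lecture.

    import numpy as np
    from scipy.stats import multivariate_normal

    def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
        # Squared-exponential kernel matrix between two input sets.
        sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2.0 * X1 @ X2.T
        return variance * np.exp(-0.5 * sq / lengthscale**2)

    def variational_lower_bound(X, y, Z, noise_var, lengthscale=1.0, variance=1.0):
        # Collapsed bound: log N(y | 0, Q_nn + noise_var*I) - tr(K_nn - Q_nn) / (2*noise_var),
        # with Q_nn = K_nm K_mm^{-1} K_mn (Nystrom approximation from inducing inputs Z).
        n, m = X.shape[0], Z.shape[0]
        Kmm = rbf_kernel(Z, Z, lengthscale, variance) + 1e-6 * np.eye(m)  # jitter for stability
        Kmn = rbf_kernel(Z, X, lengthscale, variance)
        Qnn = Kmn.T @ np.linalg.solve(Kmm, Kmn)
        log_lik = multivariate_normal.logpdf(y, mean=np.zeros(n),
                                             cov=Qnn + noise_var * np.eye(n))
        # Diagonal of K_nn is just the signal variance for an RBF kernel.
        trace_term = np.sum(variance - np.diag(Qnn)) / (2.0 * noise_var)
        return log_lik - trace_term

    # Maximizing this bound over Z, the kernel parameters, and noise_var
    # (e.g. with a gradient-based optimizer) performs joint model selection.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3.0, 3.0, size=(100, 1))
    y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(100)
    Z = X[:10].copy()  # initialize inducing inputs at a subset of the data
    print(variational_lower_bound(X, y, Z, noise_var=0.01))

The trace term penalizes inducing-input configurations under which the Nystrom approximation explains the data poorly, which is what makes the bound safe to maximize over Z as well as over the kernel parameters.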

Download slides: bark08_titsias_vmsfsgpr_01.pdf (654.5 KB)

