Auxiliary Variational Information Maximization for Dimensionality Reduction

author: David Barber, Centre for Computational Statistics and Machine Learning, University College London
published: Feb. 25, 2007,   recorded: February 2005,   views: 443

Description

Mutual Information (MI) is a long-studied measure of information content, and many attempts have been made to apply it to feature extraction and stochastic coding. In general, however, MI is computationally intractable, and most previous studies redefine the criterion in terms of approximations. Recently we described properties of a simple lower bound on MI [2] and discussed its links to some popular dimensionality reduction techniques. Here we introduce a richer family of auxiliary variational bounds on MI, which generalizes our previous approximations. Our specific focus is on applying the bound to extract informative lower-dimensional projections in the presence of irreducible Gaussian noise. We show that our method produces significantly tighter bounds on MI than the as-if Gaussian approximation [7]. We also show that learning projections to multinomial auxiliary spaces may facilitate reconstruction of the sources from noisy lower-dimensional representations.
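The simple lower bound mentioned in the abstract is, in Barber and Agakov's earlier work, of the form I(X;Y) >= H(X) + E[log q(x|y)], where q(x|y) is any variational "decoder" distribution. The sketch below estimates this bound numerically for a toy linear-Gaussian setup with a least-squares linear-Gaussian decoder; the setup, dimensions, and decoder choice are illustrative assumptions, not the paper's experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (an assumption for illustration): source x in R^d with x ~ N(0, I),
# observed lower-dimensional projection y = W x + irreducible Gaussian noise.
d, k, n = 5, 2, 100_000
W = rng.normal(size=(k, d)) / np.sqrt(d)
x = rng.normal(size=(n, d))
y = x @ W.T + 0.5 * rng.normal(size=(n, k))

# Variational decoder q(x|y) = N(x; A y, s^2 I), with A fitted by least
# squares and s^2 set to the residual variance -- one simple choice of q.
A, *_ = np.linalg.lstsq(y, x, rcond=None)
resid = x - y @ A
s2 = resid.var()

# Bound: I(X;Y) >= H(X) + E[log q(x|y)], estimated by Monte Carlo.
H_x = 0.5 * d * np.log(2 * np.pi * np.e)        # entropy of N(0, I_d)
E_log_q = (-0.5 * d * np.log(2 * np.pi * s2)
           - 0.5 * (resid ** 2).sum(axis=1).mean() / s2)
mi_lower_bound = H_x + E_log_q
print(f"variational lower bound on I(X;Y): {mi_lower_bound:.3f} nats")
```

A tighter decoder q (e.g. the exact Gaussian posterior here, or the richer auxiliary-variable families the abstract introduces) gives a tighter bound; the gap is exactly the average KL divergence between p(x|y) and q(x|y).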
