Auxiliary Variational Information Maximization for Dimensionality Reduction
published: Feb. 25, 2007, recorded: February 2005, views: 443
Mutual Information (MI) is a long-studied measure of information content, and many attempts have been made to apply it to feature extraction and stochastic coding. In general, however, MI is computationally intractable, and most previous studies redefine the criterion in terms of approximations. Recently we described properties of a simple lower bound on MI and discussed its links to some popular dimensionality reduction techniques. Here we introduce a richer family of auxiliary variational bounds on MI, which generalize our previous approximations. Our specific focus is on applying the bound to extract informative lower-dimensional projections in the presence of irreducible Gaussian noise. We show that our method produces significantly tighter bounds on MI than the as-if-Gaussian approximation. We also show that learning projections to multinomial auxiliary spaces may facilitate reconstruction of the sources from noisy lower-dimensional representations.
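The kind of variational lower bound the abstract refers to can be sketched in a simple setting. The sketch below is illustrative only (all variable names, dimensions, and the linear-Gaussian decoder are assumptions, not the authors' actual construction): it uses the standard bound I(x;y) ≥ H(x) + E[log q(x|y)], where q is any variational decoder, applied to a fixed linear projection of a Gaussian source corrupted by irreducible Gaussian noise.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (hypothetical sizes): source x in d dims, noisy
# lower-dimensional projection y = W x + Gaussian noise.
d, k, n = 5, 2, 10_000
x = rng.normal(size=(n, d))                     # x ~ N(0, I)
W = rng.normal(size=(k, d)) / np.sqrt(d)        # fixed projection
noise_std = 0.5                                 # irreducible noise level
y = x @ W.T + noise_std * rng.normal(size=(n, k))

# Variational decoder q(x|y): a linear-Gaussian fit x ~ N(yB, diag(s2)),
# with B from least squares and s2 the per-dimension residual variance.
B, *_ = np.linalg.lstsq(y, x, rcond=None)
resid = x - y @ B
s2 = resid.var(axis=0)

# Lower bound: I(x; y) >= H(x) + E[log q(x|y)].
H_x = 0.5 * d * np.log(2 * np.pi * np.e)        # entropy of N(0, I_d)
E_log_q = -0.5 * np.sum(np.log(2 * np.pi * s2)) - 0.5 * d
mi_bound = H_x + E_log_q
print(f"MI lower bound: {mi_bound:.3f} nats")
```

Any choice of decoder q gives a valid lower bound; a richer q (the abstract's auxiliary-variable family) can only tighten it, which is the sense in which the bound generalizes simpler Gaussian approximations.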