Learning Multi-Linear Representations of Probability Distributions for Efficient Inference

Author: Rajhans Samdani, University of Illinois at Urbana-Champaign
Published: Oct. 20, 2009; recorded: September 2009; views: 2826

Description

We examine the class of multi-linear polynomial representations (MLR) for expressing probability distributions over discrete variables. Recently, MLRs have been considered as intermediate representations that facilitate inference in distributions represented as graphical models. We show that MLR is an expressive representation of discrete distributions: it can concisely represent classes of distributions that require exponential size in other commonly used representations, while supporting probabilistic inference in time linear in the size of the representation. Our key contribution is a set of techniques for learning bounded-size distributions represented using MLR, which support efficient probabilistic inference. We propose algorithms for exact and approximate learning of MLRs and, through a comparison with Bayes net representations, demonstrate experimentally that MLR representations provide faster inference without sacrificing accuracy.
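To make the linear-time inference claim concrete, below is a minimal sketch (not the paper's algorithm or code) of how a multi-linear polynomial over binary variables can answer marginal and conditional queries in a single pass over its terms. The toy distribution, the term encoding, and the `evaluate` helper are illustrative assumptions; a real MLR would typically use far fewer monomials than a full joint table.

```python
# A minimal sketch (illustrative, not the paper's implementation) of inference
# with a multi-linear polynomial representation over binary variables.
# Each term is (coefficient, {variable index: required value}); evaluating the
# polynomial under partial evidence sums the coefficients of all monomials
# consistent with that evidence, which takes time linear in the number of terms.

# Hypothetical toy MLR for two binary variables X0, X1:
# P(x0, x1) = 0.4*[x0=0][x1=0] + 0.1*[x0=0][x1=1]
#           + 0.2*[x0=1][x1=0] + 0.3*[x0=1][x1=1]
MLR_TERMS = [
    (0.4, {0: 0, 1: 0}),
    (0.1, {0: 0, 1: 1}),
    (0.2, {0: 1, 1: 0}),
    (0.3, {0: 1, 1: 1}),
]

def evaluate(terms, evidence):
    """Sum the coefficients of all monomials consistent with the evidence.

    `evidence` maps variable index -> observed value; variables absent from
    `evidence` are summed out simply by not being checked.
    """
    total = 0.0
    for coeff, monomial in terms:
        if all(evidence.get(v, val) == val for v, val in monomial.items()):
            total += coeff
    return total

if __name__ == "__main__":
    # Marginal P(X0 = 1): leave X1 unobserved so it is summed out.
    p_x0 = evaluate(MLR_TERMS, {0: 1})                      # 0.5
    # Conditional P(X1 = 1 | X0 = 1) = P(X0=1, X1=1) / P(X0=1).
    p_joint = evaluate(MLR_TERMS, {0: 1, 1: 1})             # 0.3
    print("P(X0=1)        =", p_x0)
    print("P(X1=1 | X0=1) =", p_joint / p_x0)               # 0.6
```

In this toy example the term set enumerates the full joint, so nothing is compressed; the abstract's point is that for many distributions the MLR needs far fewer monomials than a tabular or Bayes-net representation, while the same single-pass evaluation still answers queries in time linear in the representation size.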


Download slides: ecmlpkdd09_samdani_lmlr_01.ppt (1.9 MB)

