Perturbative Corrections to Expectation Consistent Approximate Inference

Author: Manfred Opper, TU Berlin
Published: Dec. 31, 2007; recorded: December 2007
Description

Algorithms for approximate inference usually come without any guarantee on the quality of the approximation. Nevertheless, we often find cases where such algorithms perform extremely well at computing posterior moments when compared with time-consuming (and, in the limit, exact) Monte Carlo (MC) simulations or exact enumeration.
A prominent example is the Expectation Propagation (EP) algorithm when applied to Gaussian process classification. Can we understand when and why we can trust the approximate results or, if not, how we could obtain systematic improvements?
In this talk, we rederive the fixed point conditions of EP using the ideas of expectation consistency (EC) [1] and explicitly consider the terms neglected in the approximation. We will show how one can derive a formal (asymptotic) power series expansion for this correction and compute its leading terms. We will illustrate the approach for the case of GP classification and for networks of Ising variables.
[1] M. Opper and O. Winther, Expectation Consistent Approximate Inference, JMLR 6, 2177–2204 (2005).
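To make the baseline concrete, here is a minimal sketch of the standard EP/EC fixed-point iteration for GP binary classification, whose output the perturbative corrections discussed in the talk would improve upon. It assumes a probit likelihood and labels in {-1, +1}; the function name, toy data, and O(n^3) recomputation of the posterior are illustrative choices, not taken from the talk or from [1].

```python
import numpy as np
from scipy.stats import norm

def ep_gp_classification(K, y, n_sweeps=20, jitter=1e-9):
    """Sketch of EP moment matching for GP binary classification
    with a probit likelihood. K: kernel matrix, y: labels in {-1, +1}.
    Returns the approximate posterior mean/covariance and site parameters."""
    n = len(y)
    nu_t = np.zeros(n)       # site parameters in natural form
    tau_t = np.zeros(n)
    Sigma = K.copy()         # current Gaussian approximation
    mu = np.zeros(n)
    for _ in range(n_sweeps):
        for i in range(n):
            # cavity distribution: remove site i from the current Gaussian
            tau_cav = 1.0 / Sigma[i, i] - tau_t[i]
            nu_cav = mu[i] / Sigma[i, i] - nu_t[i]
            m_cav, v_cav = nu_cav / tau_cav, 1.0 / tau_cav
            # moments of the tilted distribution (cavity times probit site)
            z = y[i] * m_cav / np.sqrt(1.0 + v_cav)
            ratio = norm.pdf(z) / norm.cdf(z)
            m_hat = m_cav + y[i] * v_cav * ratio / np.sqrt(1.0 + v_cav)
            v_hat = v_cav - v_cav**2 * ratio * (z + ratio) / (1.0 + v_cav)
            # update site i so the Gaussian matches the tilted moments
            tau_t[i] = 1.0 / v_hat - tau_cav
            nu_t[i] = m_hat / v_hat - nu_cav
        # recompute the global Gaussian (O(n^3); fine for a sketch)
        Sigma = np.linalg.inv(np.linalg.inv(K + jitter * np.eye(n))
                              + np.diag(tau_t))
        mu = Sigma @ nu_t
    return mu, Sigma, nu_t, tau_t
```

At a fixed point, the approximating Gaussian shares its first two moments with every tilted distribution; the talk's correction terms quantify what this moment matching neglects.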


Download slides: abi07_opper_pce_01.pdf (639.1 KB)

