On the relation between Bayesian inference and certain solvable problems of stochastic control

author: Manfred Opper, Department of Artificial Intelligence, TU Berlin
published: Oct. 9, 2008,   recorded: September 2008,   views: 4643
Description

Optimal control for nonlinear stochastic dynamical systems requires the solution of a nonlinear PDE, the so-called Hamilton-Jacobi-Bellman equation. Recently, Bert Kappen and Emanuel Todorov have shown that for certain types of cost functions this equation can be transformed into a linear problem which is mathematically related to a Bayesian estimation problem. This has led to novel efficient algorithms for optimal control of such systems. I will give a simple proof of this surprising result and discuss some possible implications.
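
The abstract does not spell out the argument, but the linearization it refers to is usually obtained by a logarithmic transformation of the value function. The following sketch uses generic notation of my own (drift f, control matrix G, noise covariance nu, control-cost matrix R, temperature lambda), not the notation of the talk or slides:

% Sketch of the log transformation that linearizes the HJB equation
% (generic notation, not taken from the slides)
\begin{align*}
  &\text{Dynamics: } dx = f(x,t)\,dt + G(x)\,(u\,dt + d\xi),
     \qquad \langle d\xi\, d\xi^\top\rangle = \nu\, dt,\\
  &\text{Cost: } C = \mathbb{E}\Big[\phi(x_T)
     + \int_t^T \big(V(x_s,s) + \tfrac12 u_s^\top R\, u_s\big)\,ds\Big].\\
  &\text{HJB after minimizing over } u\ \big(u^* = -R^{-1}G^\top \nabla J\big):\\
  &\qquad -\partial_t J = V + f^\top\nabla J
     - \tfrac12 (\nabla J)^\top G R^{-1} G^\top \nabla J
     + \tfrac12 \operatorname{Tr}\!\big(G\nu G^\top \nabla^2 J\big).\\
  &\text{Assume } \nu = \lambda R^{-1} \text{ and set } J = -\lambda\log\psi.
     \text{ The quadratic term cancels, leaving the linear backward PDE}\\
  &\qquad \partial_t \psi = \Big(\tfrac{V}{\lambda} - f^\top\nabla
     - \tfrac12 \operatorname{Tr}\big(G\nu G^\top \nabla^2\big)\Big)\psi,
     \qquad \psi(x,T) = e^{-\phi(x)/\lambda}.\\
  &\text{By Feynman--Kac, } \psi(x,t)
     = \mathbb{E}_{\text{uncontrolled}}\Big[
       e^{-\frac{1}{\lambda}\left(\phi(x_T)+\int_t^T V(x_s)\,ds\right)}\Big],\\
  &\text{i.e.\ a marginal-likelihood computation with the uncontrolled dynamics as prior}\\
  &\text{and } e^{-\text{cost}/\lambda} \text{ playing the role of a likelihood.}
\end{align*}

Because psi is an expectation over the uncontrolled dynamics, it can be estimated by plain Monte Carlo, which is one route to the efficient algorithms mentioned above. Below is a minimal, hypothetical 1D sketch; the drift, costs and parameters are my own illustration (chosen so that sigma^2 = lambda / R holds), not an implementation from the talk.

import numpy as np

# Hypothetical 1D example: uncontrolled dynamics dx = f(x) dt + sigma dW,
# running state cost V(x), terminal cost phi(x).
# The linear-PDE solution psi(x, 0) = E[exp(-(phi(x_T) + int V dt) / lam)]
# is estimated by plain Monte Carlo over uncontrolled trajectories.

def f(x):            # drift of the passive (uncontrolled) dynamics
    return -x

def V(x):            # running state cost
    return 0.5 * x**2

def phi(x):          # terminal cost
    return x**2

def estimate_psi(x0, T=1.0, dt=0.01, sigma=0.5, lam=0.25, n_samples=10_000, seed=0):
    """Monte Carlo estimate of the desirability psi(x0, 0).

    Note: the linearization requires nu = lam * R^{-1}; here R = 1,
    so we need sigma**2 == lam.
    """
    rng = np.random.default_rng(seed)
    n_steps = int(T / dt)
    x = np.full(n_samples, x0, dtype=float)
    path_cost = np.zeros(n_samples)
    for _ in range(n_steps):
        path_cost += V(x) * dt
        x += f(x) * dt + sigma * np.sqrt(dt) * rng.standard_normal(n_samples)
    total_cost = path_cost + phi(x)
    return np.exp(-total_cost / lam).mean()

if __name__ == "__main__":
    lam = 0.25
    psi = estimate_psi(x0=1.0, lam=lam)
    # Value function J = -lam * log(psi); a finite-difference derivative in x0
    # would give the optimal control u* = -R^{-1} G' dJ/dx.
    print("psi(1.0, 0) ~", psi, " J(1.0, 0) ~", -lam * np.log(psi))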

Download slides: bark08_opper_otrbbi_01.pdf (87.6 KB)

