On a Connection between Importance Sampling and the Likelihood Ratio Policy Gradient

author: Jie Tang, Department of Electrical Engineering and Computer Sciences, UC Berkeley
published: March 25, 2011,   recorded: December 2010,   views: 3520

Description

Likelihood ratio policy gradient methods are among the most successful reinforcement learning algorithms, especially for learning on physical systems. We describe how the likelihood ratio policy gradient can be derived from an importance sampling perspective. This derivation highlights how likelihood ratio methods under-use past experience by (a) using past experience only to estimate the gradient of the expected return at the current policy parameterization, rather than to obtain a more complete estimate of the expected return, and (b) using only the experience gathered under the current policy, rather than all past experience, to improve the estimates. We present a new policy search method that leverages both of these observations, as well as generalized baselines, a new technique that generalizes commonly used baseline techniques for policy gradient methods. Our algorithm outperforms standard likelihood ratio policy gradient algorithms on several testbeds.
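
As a minimal sketch of the importance sampling view referred to above (written in LaTeX, with assumed notation rather than the authors' own: θ' denotes the sampling policy, p_θ(τ) the trajectory distribution under parameters θ, R(τ) the trajectory return, and U(θ) the expected return):

    % Importance sampling estimate of the expected return U(\theta),
    % built from trajectories \tau^{(1)},\dots,\tau^{(m)} drawn from an
    % earlier policy with parameters \theta':
    U(\theta) = \mathbb{E}_{\tau \sim p_\theta}[R(\tau)]
              = \mathbb{E}_{\tau \sim p_{\theta'}}\!\left[
                  \frac{p_\theta(\tau)}{p_{\theta'}(\tau)} R(\tau) \right]
      \approx \hat{U}(\theta)
            = \frac{1}{m} \sum_{i=1}^{m}
              \frac{p_\theta(\tau^{(i)})}{p_{\theta'}(\tau^{(i)})}\, R(\tau^{(i)}),
      \qquad \tau^{(i)} \sim p_{\theta'}.

    % Differentiating at \theta = \theta', where every importance weight
    % equals one, recovers the likelihood ratio policy gradient estimate:
    \nabla_\theta \hat{U}(\theta)\Big|_{\theta = \theta'}
      = \frac{1}{m} \sum_{i=1}^{m}
        \frac{\nabla_\theta p_\theta(\tau^{(i)})\big|_{\theta = \theta'}}{p_{\theta'}(\tau^{(i)})}\, R(\tau^{(i)})
      = \frac{1}{m} \sum_{i=1}^{m}
        \nabla_\theta \log p_\theta(\tau^{(i)})\Big|_{\theta = \theta'} R(\tau^{(i)}).

Under this reading, evaluating only the gradient at θ = θ' (point (a) above) discards the information that the weighted estimate \hat{U}(\theta) carries about the expected return away from θ', and drawing samples only from the current policy (point (b)) discards older experience entirely.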

Download slides: nips2010_tang_cbi_01.pdf (129.8 KB)

Download article: nips2010_0796.pdf (493.0 KB)

Download subtitles: TT/XML, RT, SRT

