Online Learning and Bregman Divergences

author: Manfred K. Warmuth, Department of Computer Science, University of California, Santa Cruz
published: Feb. 25, 2007; recorded: July 2006; views: 11707

Watch videos (click a part to launch):

Part 1 (1:13:44)
Part 2 (1:04:19)
Part 3 (1:14:42)

Description

Lecture 1: Introduction to Online Learning (predicting as well as the best expert; predicting as well as the best linear combination of experts; additive versus multiplicative families of updates; a sketch of the multiplicative update follows this list)
Lecture 2: Bregman Divergences and Loss Bounds (introduction to Bregman divergences, defined below; relative loss bounds for the linear case; the nonlinear case and matching losses; duality and the relation to exponential families)
Lecture 3: Extensions, Interpretations, Applications (online-to-batch conversions, with one standard form stated below; prior information on the weight vector; some applications)
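
As a pointer into Lecture 1's additive-versus-multiplicative theme, here is a minimal sketch of one round of the multiplicative (Hedge-style) expert update; the learning rate eta, the toy loss vectors, and the helper's name are illustrative assumptions, not taken from the lecture.

import numpy as np

def hedge_update(weights, losses, eta=0.5):
    # Multiplicative update: scale each expert's weight by
    # exp(-eta * loss), then renormalize to a probability vector.
    w = weights * np.exp(-eta * np.asarray(losses, dtype=float))
    return w / w.sum()

# Toy run with three experts and a uniform start; weight mass
# drifts toward the expert with the smallest cumulative loss.
w = np.ones(3) / 3
for losses in ([0.1, 0.9, 0.5], [0.2, 0.8, 0.4], [0.0, 1.0, 0.6]):
    w = hedge_update(w, losses)
print(w)  # expert 0 ends with the largest weight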
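For orientation on Lecture 2, the Bregman divergence of a strictly convex, differentiable function F is standardly defined as

\[
\Delta_F(u, w) \;=\; F(u) - F(w) - (u - w)^{\top} \nabla F(w).
\]

Two standard instances connect back to Lecture 1: $F(w) = \tfrac{1}{2}\|w\|_2^2$ yields the squared Euclidean distance behind the additive (gradient-descent) family, while $F(w) = \sum_i w_i \ln w_i$ yields the relative entropy behind the multiplicative (exponentiated-gradient) family.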
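For Lecture 3's online-to-batch conversions, one standard variant (an assumption; the lecture may present others) runs the online algorithm on i.i.d. examples and averages its weight vectors. For a loss L that is convex in the weights, Jensen's inequality plus the regret bound give, in expectation over the sample,

\[
\bar{w} = \frac{1}{T}\sum_{t=1}^{T} w_t,
\qquad
\mathbb{E}\bigl[L(\bar{w})\bigr] \;\le\; \inf_{w} L(w) + \frac{\mathrm{Regret}_T}{T}.
\]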

Download slides: mlss06tw_warmuth_olbd.pdf (1.3 MB)