Online Learning and Bregman Divergences
author: Manfred K. Warmuth,
Department of Computer Science, University of California Santa Cruz
published: Feb. 25, 2007, recorded: July 2006, views: 11707
Description
Lecture 1: Introduction to Online Learning (predicting as well as the best expert; predicting as well as the best linear combination of experts; additive versus multiplicative families of updates)
Lecture 2: Bregman Divergences and Loss Bounds (introduction to Bregman divergences; relative loss bounds for the linear case; the nonlinear case and matching losses; duality and the relation to exponential families)
Lecture 3: Extensions, Interpretations, Applications (online-to-batch conversions; prior information on the weight vector; some applications)
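Lecture 1's "predicting as well as the best expert" is usually illustrated with a multiplicative-update algorithm such as Hedge: the learner keeps a weight per expert, predicts with the weighted average, and multiplies each weight by an exponential factor of that expert's loss. A minimal sketch (the function name `hedge` and the learning rate `eta` are illustrative choices, not taken from the lectures):

```python
import math

def hedge(expert_losses, eta=0.5):
    """Multiplicative-weights (Hedge) sketch.

    expert_losses: list of rounds; each round is a list of per-expert losses.
    Each round the learner incurs the weighted average of the experts' losses,
    then multiplies each weight by exp(-eta * loss) and renormalizes.
    Returns the final weight vector and the learner's cumulative loss.
    """
    n = len(expert_losses[0])
    w = [1.0 / n] * n          # start with uniform weights
    total_loss = 0.0
    for losses in expert_losses:
        # learner's loss this round: weighted average of expert losses
        total_loss += sum(wi * li for wi, li in zip(w, losses))
        # multiplicative update, then renormalize
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
        z = sum(w)
        w = [wi / z for wi in w]
    return w, total_loss
```

With two experts where expert 1 is always right, the weight mass quickly concentrates on expert 1, and the learner's cumulative loss stays close to the best expert's (here, near zero) rather than growing with the number of rounds.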
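Lecture 2 introduces Bregman divergences: for a differentiable convex function F, the divergence D_F(p, q) = F(p) - F(q) - ∇F(q)·(p - q) measures how far F's linearization at q undershoots F at p. A small sketch showing the two standard special cases (function and variable names here are illustrative): the negative entropy yields the relative entropy (KL divergence), and half the squared norm yields half the squared Euclidean distance.

```python
import math

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <gradF(q), p - q>."""
    return F(p) - F(q) - sum(g * (pi - qi)
                             for g, pi, qi in zip(gradF(q), p, q))

# F(x) = sum_i x_i log x_i (negative entropy); on probability vectors,
# its Bregman divergence is the relative entropy (KL divergence).
def neg_entropy(x):
    return sum(xi * math.log(xi) for xi in x)

def grad_neg_entropy(x):
    return [math.log(xi) + 1.0 for xi in x]

# F(x) = ||x||^2 / 2; its Bregman divergence is half the squared
# Euclidean distance, the divergence underlying additive updates.
def half_sq_norm(x):
    return 0.5 * sum(xi * xi for xi in x)

def grad_half_sq_norm(x):
    return list(x)
```

This pairing is the heart of the additive-versus-multiplicative contrast from Lecture 1: gradient descent is the update regularized by the squared Euclidean divergence, while the exponentiated-gradient family is regularized by the relative entropy.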