Advanced Statistical Learning Theory

author: Olivier Bousquet, Google, Inc.
published: Feb. 25, 2007,   recorded: September 2004

Watch videos:

Part 1 (55:44)
Part 2 (39:06)
Part 3 (1:09:05)

Description

This set of lectures complements the statistical learning theory course and focuses on recent advances in classification:

1. PAC-Bayesian bounds: a simple derivation and a comparison with Rademacher averages.
2. Local Rademacher complexity with the classification loss; Talagrand's inequality; Tsybakov's noise conditions.
3. Properties of loss functions for classification: their influence on approximation and estimation, and their relationship with the noise conditions.
4. Applications to SVMs: estimation and approximation properties, and the role of the eigenvalues of the Gram matrix (a numerical sketch follows this list).
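As a rough companion to items 1, 2, and 4, here is a minimal NumPy sketch (not part of the lecture materials). It uses two standard facts about kernel classes: the empirical Rademacher average of the RKHS unit ball has the closed form (1/n) E_sigma sqrt(sigma^T K sigma), and the local Rademacher complexity at radius r is bounded by sqrt((2/n) sum_i min(r, lambda_i)), where the lambda_i are the eigenvalues of the normalized Gram matrix K/n. The helper names (rbf_gram, empirical_rademacher, local_rademacher_bound) and all parameter values are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def rbf_gram(X, gamma=1.0):
        """Gram matrix of the Gaussian kernel k(x, y) = exp(-gamma ||x - y||^2)."""
        sq = np.sum(X**2, axis=1)
        d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
        return np.exp(-gamma * d2)

    def empirical_rademacher(K, n_mc=2000):
        """Monte Carlo estimate of the empirical Rademacher average of the
        RKHS unit ball: (1/n) E_sigma sqrt(sigma^T K sigma)."""
        n = K.shape[0]
        sigma = rng.choice([-1.0, 1.0], size=(n_mc, n))
        quad = np.einsum('ij,jk,ik->i', sigma, K, sigma)  # sigma^T K sigma per draw
        return np.mean(np.sqrt(np.clip(quad, 0.0, None))) / n

    def local_rademacher_bound(K, r):
        """Spectral bound sqrt((2/n) * sum_i min(r, lambda_i)) on the local
        Rademacher complexity, lambda_i being eigenvalues of K/n."""
        n = K.shape[0]
        lam = np.clip(np.linalg.eigvalsh(K / n), 0.0, None)  # guard round-off
        return np.sqrt(2.0 / n * np.sum(np.minimum(r, lam)))

    X = rng.normal(size=(200, 5))
    K = rbf_gram(X, gamma=0.5)
    print(f"global Rademacher average ~ {empirical_rademacher(K):.4f}")
    for r in (1.0, 0.1, 0.01):
        print(f"local bound at r = {r:<5}: {local_rademacher_bound(K, r):.4f}")

Shrinking r shrinks the local bound, which is the mechanism behind the faster rates discussed in the lectures: only eigenvalues above the radius r contribute at full strength, so a fast-decaying Gram spectrum yields a small localized complexity.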

See Also:

Download slides: mlss04_bousquet_aslt_01.pdf (384.4 KB)

