The Limit of One-Class SVM
author: Regis Vert, University of Paris-Sud 11
published: Feb. 25, 2007, recorded: October 2005, views: 9840
Description
In this talk, I will present an analysis of the asymptotic behaviour of the One-Class support vector machine (SVM), a popular algorithm for outlier detection. I will show that, when the Gaussian kernel is used with a well-calibrated decreasing bandwidth parameter and the regularization parameter is held fixed as the training sample size goes to infinity, One-Class SVM asymptotically estimates a truncated version of the density of the distribution generating the data.

A long version of this work, which also considers extensions to the 2-class case and to more general convex loss functions, can be found at www.lri.fr/vert/Publi/regularizeGaussianKernel.ps.
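As a rough illustration of the statement above (not the experiment or parameterization analysed in the talk), the following sketch trains scikit-learn's OneClassSVM with a Gaussian (RBF) kernel whose bandwidth shrinks as the sample size grows, and compares the learned decision function with a truncated version of the true density. The bandwidth schedule, the nu value, and the truncation level are arbitrary choices for demonstration only.

# Minimal sketch: One-Class SVM with a shrinking Gaussian bandwidth,
# compared against a truncated version of the data-generating density.
# The schedule sigma = n^(-1/5), nu = 0.5, and the truncation level 0.2
# are illustrative assumptions, not values from the paper.
import numpy as np
from scipy.stats import norm
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
grid = np.linspace(-4, 4, 400).reshape(-1, 1)
true_density = norm.pdf(grid.ravel())          # data come from a standard Gaussian

for n in (200, 2000, 20000):
    X = rng.standard_normal((n, 1))
    sigma = n ** (-1.0 / 5.0)                  # assumed decreasing bandwidth schedule
    gamma = 1.0 / (2.0 * sigma ** 2)           # sklearn's RBF: exp(-gamma * ||x - x'||^2)
    ocsvm = OneClassSVM(kernel="rbf", gamma=gamma, nu=0.5).fit(X)

    # Rescale the decision function and compare it with the density
    # truncated at a fixed level (standing in for the fixed regularization).
    scores = ocsvm.decision_function(grid)
    scores = (scores - scores.min()) / (scores.max() - scores.min())
    truncated = np.maximum(true_density - 0.2, 0.0)
    truncated /= truncated.max()
    print(f"n={n:6d}  sup-norm gap to truncated density: "
          f"{np.abs(scores - truncated).max():.3f}")

The point of the comparison is qualitative: with the bandwidth decreasing at a suitable rate and the regularization held fixed, the One-Class SVM decision function behaves like a density estimate that is cut off below a fixed level, rather than like an estimate of the full density.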
Reviews and comments:
Any chance of getting this in a non-Windows format?