The First-Order View of Boosting Methods: Computational Complexity and Connections to Regularization
author: Paul Grigas,
Industrial Engineering and Operations Research Department, UC Berkeley
published: Aug. 26, 2013, recorded: July 2013, views: 3230
Description
Incremental Forward Stagewise Regression (FS") is a statistical algorithm that produces sparse coefficient profiles for linear regression. Using the tools of first-order methods in convex optimization, we analyze the computational complexity of FS" and its flexible variants with adaptive shrinkage parameters. We also show that a simple modication to FS" yields an O(1=k) convergent algorithm for the least squares LASSO t for any regularization parameter and any data-set | thereby quantitatively characterizing the nature of regularization implicitly induced by FS".