Limited-memory quasi-Newton and Hessian-free Newton methods for non-smooth optimization

author: Mark Schmidt, Department of Computer Science, University of British Columbia
published: Jan. 13, 2011,   recorded: December 2010,   views: 7876

Description

Limited-memory quasi-Newton and Hessian-free Newton methods are two workhorses of unconstrained optimization of high-dimensional smooth objectives. However, in many cases we would like to optimize a high-dimensional unconstrained objective function that is non-smooth due to the presence of a ‘simple’ non-smooth regularization term. Motivated by problems arising in estimating sparse graphical models, in this talk we focus on strategies for extending limited-memory quasi-Newton and Hessian-free Newton methods for unconstrained optimization to this scenario. We first consider two-metric (sub-)gradient projection methods for problems where the regularizer is separable, and then consider proximal Newton-like methods for group-separable and non-separable regularizers. We will discuss several applications where sparsity-encouraging regularizers are used to estimate graphical model parameters and/or structure, including the estimation of sparse, blockwise-sparse, and structured-sparse models.
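The talk itself is not reproduced here, but the problem class it addresses (a smooth loss plus a 'simple' separable non-smooth regularizer) can be illustrated with a minimal sketch of proximal gradient descent (ISTA) on L1-regularized least squares, where the proximal operator of the L1 norm is elementwise soft-thresholding. This is only the basic first-order building block, not the quasi-Newton or Hessian-free extensions discussed in the talk; all names, step sizes, and the demo data below are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def proximal_gradient(A, b, lam, step, iters=500):
    # Minimize 0.5 * ||A x - b||^2 + lam * ||x||_1 via ISTA:
    # a gradient step on the smooth part, then the prox of the L1 part.
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)          # gradient of the smooth term
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Illustrative demo: recover a sparse vector from noiseless measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.5, 1.0]
b = A @ x_true
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
x_hat = proximal_gradient(A, b, lam=0.1, step=1.0 / L)
```

Methods like those in the talk replace the scalar step size with curvature information (an L-BFGS metric or Hessian-vector products), while keeping the regularizer's cheap proximal or projection structure intact.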


Download slides: nipsworkshops2010_schmidt_lmq_01.pdf (1.1 MB)



Reviews and comments:

Michael, March 14, 2012 at 5:37 a.m.:

Wonderful talk!

Your explanations of Hessian-free learning were invaluable to me when putting together a presentation on a paper about Hessian-free learning.

Can't thank you enough! Excellent excellent excellent.


Hariprasad Kannan, August 7, 2015 at 11:01 p.m.:

Excellent talk, giving a lot of important details. Thanks a lot.
