Workshop on Sparsity in Machine Learning and Statistics, Cumberland Lodge 2009

Sparse estimation (or sparse recovery) plays an increasingly important role in the statistics and machine learning communities. Several methods that rely on the notion of sparsity have recently been developed in both fields, such as penalty methods like the Lasso and the Dantzig selector. Many of the key theoretical ideas and statistical analyses of these methods have been developed independently, but there is increasing awareness of the potential for cross-fertilization of ideas between statistics and machine learning.
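
As a point of reference (a standard formulation, not taken from the workshop materials): for a design matrix $X \in \mathbb{R}^{n \times p}$ and response $y \in \mathbb{R}^n$, the Lasso estimate solves the $\ell_1$-penalized least-squares problem

\[ \hat{\beta} = \arg\min_{\beta \in \mathbb{R}^p} \tfrac{1}{2n} \| y - X\beta \|_2^2 + \lambda \|\beta\|_1, \]

where $\lambda \ge 0$ controls the degree of sparsity: larger values drive more coefficients exactly to zero, which is what makes the method a variable-selection tool.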

Furthermore, there are interesting links between Lasso-type methods and boosting (particularly LP-boosting), and there has been renewed interest in sparse Bayesian methods. Sparse estimation is also important in unsupervised methods (sparse PCA, etc.), and recent machine learning techniques for multi-task learning and collaborative filtering impose sparsity constraints on matrices (low rank, structured sparsity, etc.); a sketch of the matrix case follows below. At the same time, sparsity plays an important role in various application fields, ranging from image and video reconstruction and compression to speech classification, text, and sound analysis.
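
To illustrate the matrix case (again a standard formulation, assumed rather than quoted from the workshop): with $T$ related tasks, design matrices $X_t$, responses $y_t$, and a coefficient matrix $W = [w_1, \dots, w_T] \in \mathbb{R}^{p \times T}$, a common convex surrogate for a rank constraint is the trace (nuclear) norm penalty

\[ \hat{W} = \arg\min_{W} \sum_{t=1}^{T} \| y_t - X_t w_t \|_2^2 + \lambda \|W\|_*, \]

where $\|W\|_*$ is the sum of the singular values of $W$. Just as the $\ell_1$ norm promotes sparse vectors, the trace norm promotes low-rank matrices, so the tasks are encouraged to share a small set of latent features.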

The overall goal of the workshop is to bring together machine learning researchers and statisticians working on this timely research topic, to encourage the exchange of ideas between the two communities, and to discuss further developments and the theoretical underpinnings of these methods.

For detailed information, visit the workshop website.
