NIPS Workshop on Optimization for Machine Learning, Whistler 2008

Classical optimization techniques have found widespread use in machine learning. Convex optimization has occupied center stage, and significant effort continues to be devoted to it. New problems constantly emerge in machine learning, e.g., structured learning and semi-supervised learning, while at the same time fundamental problems such as clustering and classification continue to be better understood. Moreover, machine learning is increasingly applied to real-world problems that involve massive datasets, streaming inputs, the need for distributed computation, and complex models.

These challenging characteristics of modern problems and datasets indicate that we must go beyond the "traditional optimization" approaches common in machine learning. What is needed is optimization "tuned" for machine learning tasks. For example, techniques such as non-convex optimization (for semi-supervised learning and sparsity constraints), combinatorial optimization and relaxations (structured learning), stochastic optimization (massive datasets), decomposition techniques (parallel and distributed computation), and online learning (streaming inputs) are relevant in this setting. These techniques naturally draw inspiration from other fields, such as operations research, polyhedral combinatorics, theoretical computer science, and the broader optimization community.
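
As a concrete illustration of how stochastic optimization and online learning differ from batch methods, here is a minimal sketch of stochastic gradient descent for least-squares regression, updating on one example at a time as it would on streaming or massive data. It is only an illustrative sketch, not material from the workshop; the names (`sgd_least_squares`, `synthetic_stream`) and parameter values are hypothetical.

```python
import numpy as np

def sgd_least_squares(stream, dim, lr=0.01, n_steps=10_000):
    """Stochastic gradient descent for least-squares regression.

    Each update uses a single (x, y) example rather than a full-batch
    gradient, which is what makes the approach attractive for massive
    datasets and streaming inputs.
    """
    w = np.zeros(dim)
    for _ in range(n_steps):
        x, y = next(stream)        # one example from the stream
        grad = (x @ w - y) * x     # gradient of 0.5 * (x.w - y)^2 w.r.t. w
        w -= lr * grad             # stochastic update
    return w

def synthetic_stream(dim, rng):
    """Hypothetical endless stream of noisy linear observations."""
    true_w = rng.standard_normal(dim)
    while True:
        x = rng.standard_normal(dim)
        yield x, x @ true_w + 0.01 * rng.standard_normal()

rng = np.random.default_rng(0)
w_hat = sgd_least_squares(synthetic_stream(dim=5, rng=rng), dim=5)
print(w_hat)  # approximates the stream's underlying weight vector
```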

More information about the workshop: http://opt2008.kyb.tuebingen.mpg.de/
