

NIPS Workshop on Kernel Methods and Structured Domains / NIPS Workshop on Large Scale Kernel Machines, Whistler 2005

Kernel Methods and Structured Domains

Substantial recent work in machine learning has focused on the problem of dealing with inputs and outputs on more complex domains than those provided for in the classical regression/classification setting. Structured representations can give a more informative view of input domains, which is crucial for the development of successful learning algorithms: application areas include determining protein structure and protein-protein interaction; part-of-speech tagging; the organization of web documents into hierarchies; and image segmentation. Likewise, a major research direction is the use of structured output representations, which have been applied in a broad range of areas including several of the foregoing examples (for instance, the output required of the learning algorithm may be a probabilistic model, a graph, or a ranking).
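
As a minimal illustration of a kernel on a structured input domain (a sketch for this page, not material from the workshop), the code below implements a simple k-spectrum string kernel, which compares two sequences such as proteins or text by counting shared substrings of length k. The function names and toy sequences are hypothetical.

```python
# Illustrative sketch: a k-spectrum string kernel, i.e. a kernel defined
# directly on a structured input domain (strings) rather than on fixed-length
# feature vectors.
from collections import Counter

def spectrum_features(s: str, k: int = 3) -> Counter:
    """Map a string to counts of its length-k substrings (its k-spectrum)."""
    return Counter(s[i:i + k] for i in range(len(s) - k + 1))

def spectrum_kernel(x: str, y: str, k: int = 3) -> int:
    """Inner product of the two k-spectrum feature vectors."""
    fx, fy = spectrum_features(x, k), spectrum_features(y, k)
    return sum(count * fy[sub] for sub, count in fx.items())

if __name__ == "__main__":
    a = "MKTAYIAKQR"   # toy "protein-like" sequences (hypothetical)
    b = "MKTAYLAKQR"
    print(spectrum_kernel(a, b, k=3))  # similarity via shared 3-mers
```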

Large Scale Kernel Machines

Datasets with millions of observations can be gathered by crawling the web, mining business databases, or connecting a cheap video tuner to a laptop, so vastly more ambitious learning systems are theoretically possible, and the literature shows no shortage of ideas for sophisticated statistical models. The computational cost of learning algorithms is now the bottleneck: during the last decade, dataset sizes have outgrown processor speeds, while machine learning algorithms have become more principled and also more computationally expensive.
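
A minimal sketch of one common way to scale kernel machines, assuming an RBF kernel and a Nyström low-rank approximation (illustrative only, not a method presented at the workshop; all names and parameters are hypothetical):

```python
# Illustrative sketch: Nystrom approximation of an RBF kernel matrix, which
# replaces the exact n x n Gram matrix (infeasible for millions of points)
# with a rank-m feature map built from m sampled landmark points.
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Exact RBF kernel between the rows of A and the rows of B."""
    sq = (A**2).sum(1)[:, None] + (B**2).sum(1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def nystrom_features(X, m=100, gamma=1.0, rng=None):
    """Return Z (n x m) such that Z @ Z.T approximates the full kernel matrix."""
    rng = np.random.default_rng(rng)
    landmarks = X[rng.choice(len(X), size=m, replace=False)]
    K_mm = rbf_kernel(landmarks, landmarks, gamma)   # m x m
    K_nm = rbf_kernel(X, landmarks, gamma)           # n x m
    # K ~ K_nm @ pinv(K_mm) @ K_nm.T, factored as Z @ Z.T
    U, s, _ = np.linalg.svd(K_mm)
    Z = K_nm @ U / np.sqrt(np.maximum(s, 1e-12))
    return Z

if __name__ == "__main__":
    X = np.random.default_rng(0).normal(size=(5000, 20))
    Z = nystrom_features(X, m=200, gamma=0.1, rng=0)
    print(Z.shape)  # (5000, 200)
```

A linear model trained on Z then stands in for the corresponding kernel machine at a fraction of the memory and training cost.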

