Sparse Coding for Multi-task and Transfer Learning

author: Massimiliano Pontil, Department of Computer Science, University College London
published: Oct. 6, 2014,   recorded: December 2013

Description

We consider the problem of learning many regression or binary classification tasks simultaneously, under the assumption that the tasks' weight vectors are well approximated as sparse combinations of the atoms of a dictionary. This assumption, together with the large number of available tasks, allows for a principled method for choosing the dictionary. We provide theoretical and experimental justifications of this claim, both in the domain of multitask learning, where the learned dictionary is applied to a fixed set of tasks, and in the domain of learning to learn, where the tasks are randomly generated and the learned dictionary is applied to new tasks sampled from the same process. These results also imply that, as the number of tasks grows, our method matches the performance of the Lasso with the best a priori known dictionary. Finally, we discuss extensions of our method to other coding schemes beyond sparse coding and to multilayer networks.

This is joint work with Andreas Maurer and Bernardino Romera-Paredes.
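
To give a concrete picture of the setting, the sketch below is an illustrative implementation of multitask sparse coding by alternating minimization; it is a minimal example under stated assumptions, not the authors' exact algorithm or experimental code. Each task t with data (X_t, y_t) has its weight vector modeled as w_t ≈ D c_t, where D is a shared dictionary and c_t is a sparse code; codes are fit by per-task Lasso with D fixed, then D is refit by least squares over all tasks and its atoms renormalized. The function name and parameters (multitask_sparse_coding, n_atoms, reg, n_iters) are hypothetical choices made for the example.

    # Illustrative sketch only: multitask sparse coding via alternating minimization.
    import numpy as np
    from sklearn.linear_model import Lasso

    def multitask_sparse_coding(tasks, n_atoms=10, reg=0.1, n_iters=20, seed=0):
        """tasks: list of (X_t, y_t) pairs; returns dictionary D and sparse codes C."""
        rng = np.random.default_rng(seed)
        d = tasks[0][0].shape[1]                    # input dimension
        D = rng.standard_normal((d, n_atoms))
        D /= np.linalg.norm(D, axis=0)              # unit-norm atoms
        C = np.zeros((n_atoms, len(tasks)))

        for _ in range(n_iters):
            # Step 1: with D fixed, each task solves a Lasso in its code c_t,
            # matching the assumption that w_t is a sparse combination of atoms.
            for t, (X, y) in enumerate(tasks):
                lasso = Lasso(alpha=reg, fit_intercept=False, max_iter=5000)
                lasso.fit(X @ D, y)                 # task features become X_t D
                C[:, t] = lasso.coef_

            # Step 2: with codes fixed, refit D by least squares over all tasks,
            # then renormalize atoms (a simple surrogate for a norm constraint).
            A = np.zeros((d * n_atoms, d * n_atoms))
            b = np.zeros(d * n_atoms)
            for t, (X, y) in enumerate(tasks):
                M = np.kron(C[:, t][None, :], X)    # so that M @ vec(D) = X_t D c_t
                A += M.T @ M
                b += M.T @ y
            D = np.linalg.solve(A + 1e-8 * np.eye(d * n_atoms), b)
            D = D.reshape(d, n_atoms, order='F')
            norms = np.linalg.norm(D, axis=0)
            D /= np.where(norms > 0, norms, 1.0)
        return D, C

The alternation mirrors the structure of the problem described above: the Lasso step enforces sparsity of each task's code, while sharing a single dictionary across many tasks is what makes the dictionary identifiable as the number of tasks grows.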
