An Efficient Sparse Metric Learning in High-Dimensional Space via L1-Penalized Log-Determinant Regularization

author: Guo-Jun Qi, Department of Electrical and Computer Engineering, University of Illinois at Urbana-Champaign
published: Sept. 17, 2009, recorded: June 2009, views: 4641

Description

This paper proposes an efficient sparse metric learning algorithm in high-dimensional space via $\ell_1$-penalized log-determinant regularization. Compared with most existing distance metric learning algorithms, the proposed algorithm exploits the sparse structure underlying the intrinsic high-dimensional feature space. This sparsity prior on the learned distance metric regularizes the complexity of the distance model, especially in the setting of few examples ($n$) and high dimension ($d$). Theoretically, by analogy to the covariance estimation problem, we show that the proposed distance learning algorithm is consistent, converging to the target distance matrix with at most $m$ nonzeros per row at rate $\mathcal{O}\left(\sqrt{\left(m^2 \log d\right)/n}\right)$. Moreover, from the implementation perspective, the $\ell_1$-penalized log-determinant formulation can be efficiently optimized in a block coordinate descent fashion, which is much faster than the standard semidefinite programming widely adopted in many other advanced distance learning algorithms. We compare this algorithm with other state-of-the-art methods on various datasets and obtain competitive results.
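
The abstract describes an $\ell_1$-penalized log-determinant program solved by block coordinate descent. A program of the generic form $\min_{M \succ 0} \mathrm{tr}(MS) - \log\det M + \lambda\|M\|_1$ shares that structure with the graphical lasso, so the minimal sketch below uses scikit-learn's GraphicalLasso estimator (whose default solver is block coordinate descent) as a stand-in solver. The construction of $S$ from similar-pair difference vectors, the toy data, and all parameter values are illustrative assumptions, not the paper's actual formulation or code.

```python
# Illustrative sketch only (not the paper's exact algorithm).
# The l1-penalized log-determinant program
#     min_{M > 0}  tr(M S) - log det M + lambda * ||M||_1
# has the same form as the graphical lasso, so scikit-learn's
# GraphicalLasso (block coordinate descent) is used as a stand-in solver.
# Building S from similar-pair differences is a simplifying assumption.
import numpy as np
from sklearn.covariance import GraphicalLasso

rng = np.random.default_rng(0)

# Toy data: n examples in d dimensions.
n, d = 200, 30
X = rng.standard_normal((n, d))

# Assumed surrogate input: difference vectors of "similar" pairs
# (here, consecutive examples stand in for labeled similar pairs).
D = X[0::2] - X[1::2]          # shape (n // 2, d)

# Block coordinate descent on the penalized log-det objective; the fitted
# precision matrix plays the role of the sparse Mahalanobis metric M.
solver = GraphicalLasso(alpha=0.1, assume_centered=True, max_iter=200)
solver.fit(D)
M = solver.precision_          # sparse positive-definite metric estimate

def metric_distance(x, y, M=M):
    """Mahalanobis distance under the learned metric M."""
    diff = x - y
    return float(np.sqrt(diff @ M @ diff))

print("avg nonzeros per row of M:", np.mean(np.count_nonzero(M, axis=1)))
print("d(x0, x1) =", metric_distance(X[0], X[1]))
```

The fitted precision matrix is positive definite and row-sparse, which is the "at most $m$ nonzeros per row" regime the consistency rate above refers to; on real data, the input matrix would instead be built from the labeled similarity and dissimilarity constraints used in the paper.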

Download slides: icml09_qi_esml_01.ppt (1.0 MB)