Minimax rates for memory-bounded sparse linear regression

author: Jacob Steinhardt, Computer Science Department, Stanford University
published: Aug. 20, 2015,   recorded: July 2015

Description

We establish a minimax lower bound of $\Omega\left(\frac{kd}{b\epsilon}\right)$ on the number of samples needed to estimate the parameters in a $k$-sparse linear regression of dimension $d$ given a memory bound of $b$ bits, where $\epsilon$ is the $L_2$ parameter error. When the covariance of the regressors is the identity matrix, we also provide an algorithm that uses $\tilde{O}(b+k)$ bits and requires $\tilde{O}(\frac{kd}{b\epsilon^2})$ samples to achieve error $\epsilon$. Our lower bound also holds in the more general communication-bounded setting, where instead of a memory bound, at most $b$ bits of information are allowed to be (adaptively) communicated about each sample.
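
The memory/sample trade-off in the identity-covariance setting can be illustrated with a toy streaming estimator. When the covariates have identity covariance, $E[x_j y] = w_j$, so each coordinate can be estimated by averaging $x_j y$ over fresh samples; scanning coordinates in blocks keeps only one running average per coordinate in the current block, plus a running top-$k$ candidate list, so smaller memory means more passes and hence more samples, mirroring the $\tilde{O}(\frac{kd}{b\epsilon^2})$ sample count in the abstract. The sketch below is illustrative only, not the paper's algorithm; the function names and parameters are hypothetical.

```python
import numpy as np

def block_streaming_topk(sample_stream, d, k, block_size, n_per_block):
    """Toy memory-bounded estimator for k-sparse regression with
    identity covariance: averages x_j * y (whose mean is w_j) over
    fresh samples, one block of coordinates at a time."""
    # Running top-k candidates: O(k) state carried across blocks.
    cand_idx = np.empty(0, dtype=int)
    cand_val = np.empty(0)
    for start in range(0, d, block_size):
        stop = min(start + block_size, d)
        acc = np.zeros(stop - start)          # O(block_size) state
        for _ in range(n_per_block):
            x, y = next(sample_stream)        # one fresh sample per update
            acc += x[start:stop] * y
        block_est = acc / n_per_block         # estimates E[x_j y] = w_j
        # Merge this block with the running candidates; keep the k
        # coordinates with largest |estimate|.
        idx = np.concatenate([cand_idx, np.arange(start, stop)])
        val = np.concatenate([cand_val, block_est])
        keep = np.argsort(np.abs(val))[-k:]
        cand_idx, cand_val = idx[keep], val[keep]
    w_hat = np.zeros(d)
    w_hat[cand_idx] = cand_val
    return w_hat

def gaussian_stream(w, noise_std, rng):
    """Synthetic stream: x ~ N(0, I_d), y = <w, x> + noise."""
    while True:
        x = rng.standard_normal(w.shape[0])
        yield x, x @ w + noise_std * rng.standard_normal()

rng = np.random.default_rng(0)
d, k = 200, 5
w = np.zeros(d)
w[rng.choice(d, size=k, replace=False)] = 1.0
w_hat = block_streaming_topk(gaussian_stream(w, 0.1, rng), d, k,
                             block_size=20, n_per_block=500)
print("L2 error:", np.linalg.norm(w_hat - w))
```

With `block_size = 20` here, the loop holds about 20 running averages plus the $k$ candidates at any time; shrinking `block_size` reduces memory but requires proportionally more blocks, and hence more fresh samples, which is exactly the kind of trade-off the lower bound shows is unavoidable.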

Download slides: colt2015_steinhardt_linear_regression_01.pdf (321.8 KB)

