Lower Bounds and a Near-Optimal Shrinkage Estimator for Least Squares Using Random Projections


Abstract:

We consider optimization using random projections as a statistical estimation problem, where the squared distance between the predictions from the estimator and the true solution is the error metric. In approximately solving a large-scale least squares problem using Gaussian sketches, we show that the sketched solution has a conditional Gaussian distribution with the true solution as its mean. First, tight worst-case error lower bounds with explicit constants are derived for any estimator using the Gaussian sketch, and the classical sketched solution is shown to be the optimal unbiased estimator. For biased estimators, the lower bound also incorporates prior knowledge about the true solution. Second, we use the James-Stein estimator to derive an improved estimator for the least squares solution using the Gaussian sketch. An upper bound on the expected error of this estimator is derived, which is smaller than the error of the classical Gaussian sketch solution for any given data. The upper and lower bounds match when the SNR of the true solution is known to be small and the data matrix is well-conditioned. Empirically, this estimator achieves smaller error on simulated and real datasets, and works for other common sketching methods as well.
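As an illustration of the setup in the abstract, the sketch below compares the classical Gaussian-sketch least squares solution with a James-Stein-style shrinkage of that solution. This is a minimal sketch under assumed problem sizes, with a generic shrinkage factor of James-Stein form; it is not the paper's exact near-optimal estimator, whose shrinkage coefficient depends on problem-specific quantities derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overdetermined least squares problem: n samples, d unknowns,
# sketched down to m rows (sizes chosen for illustration only).
n, d, m = 2000, 20, 200
A = rng.standard_normal((n, d))
x_true = rng.standard_normal(d)
b = A @ x_true + 0.1 * rng.standard_normal(n)

# Gaussian sketch: S has i.i.d. N(0, 1/m) entries, so E[S.T @ S] = I_n
# and row norms are preserved in expectation.
S = rng.standard_normal((m, n)) / np.sqrt(m)

# Classical sketched solution: solve the compressed problem min ||SAx - Sb||^2.
x_sketch, *_ = np.linalg.lstsq(S @ A, S @ b, rcond=None)

# Illustrative James-Stein-style shrinkage of the sketched solution toward
# the origin (a stand-in for the paper's estimator, not its exact form).
shrink = max(0.0, 1.0 - (d - 2) / (m * float(x_sketch @ x_sketch)))
x_shrunk = shrink * x_sketch

# Error metric from the abstract: squared distance between predictions
# from the estimator and predictions from the true solution.
err_sketch = float(np.linalg.norm(A @ (x_sketch - x_true)) ** 2)
err_shrunk = float(np.linalg.norm(A @ (x_shrunk - x_true)) ** 2)
print(f"classical sketch error: {err_sketch:.4f}")
print(f"shrunken sketch error:  {err_shrunk:.4f}")
```

The shrinkage factor here moves the estimate slightly toward zero, which (as in the classical James-Stein setting) trades a small bias for reduced variance; the paper's estimator chooses this trade-off near-optimally.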
Published in: IEEE Journal on Selected Areas in Information Theory ( Volume: 1, Issue: 3, November 2020)
Page(s): 660 - 668
Date of Publication: 24 November 2020
Electronic ISSN: 2641-8770

