Years and Authors of Summarized Original Work
2006; Sarlós
Problem Definition
This entry surveys some of the applications of “oblivious subspace embeddings,” introduced by Sarlós in [19], to problems in linear algebra.
Definition 1 ([19])
Given 0 < ε < 1∕2 and a d-dimensional subspace \(E \subseteq \mathbb{R}^{n}\), we say an m × n matrix Π is an ε-subspace embedding for E if \(\forall z \in E,\ (1-\varepsilon)\|z\|_{2} \leq \|\Pi z\|_{2} \leq (1+\varepsilon)\|z\|_{2}\).
The goal is to have m small so that Π provides dimensionality reduction for E.
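As a concrete sanity check (not part of the original entry), the embedding condition can be verified numerically: if U is an orthonormal basis for E, then Π is an ε-subspace embedding for E exactly when every singular value of ΠU lies in [1 − ε, 1 + ε]. The sketch below, assuming NumPy and a dense Gaussian Π (a classical construction; the specific dimensions are illustrative), checks this condition for a random subspace.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 1000, 5, 0.5
m = 200  # rows of Pi; theory suggests m = Theta(d / eps^2) suffices for Gaussian Pi

# A random d-dimensional subspace E of R^n, via an orthonormal basis U.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

# Dense Gaussian Pi with entries N(0, 1/m).
Pi = rng.standard_normal((m, n)) / np.sqrt(m)

# Pi is an eps-subspace embedding for E iff all singular values of Pi @ U
# lie in [1 - eps, 1 + eps]: for z = U x in E, ||Pi z|| = ||(Pi U) x|| and
# ||z|| = ||x||, so the singular values control the distortion exactly.
s = np.linalg.svd(Pi @ U, compute_uv=False)
ok = (s.min() >= 1 - eps) and (s.max() <= 1 + eps)
print(ok)
```

The singular-value test is how embeddings are typically verified in practice, since checking the norm condition for all (infinitely many) z in E directly is impossible.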
Definition 2 ([19])
Given 0 < ε, δ < 1∕2 and integers 1 ≤ d ≤ n, an (ε,δ,d,n)-oblivious subspace embedding (OSE) is a distribution \(\mathcal{D}\) over m × n matrices such that for every linear subspace \(E \subseteq \mathbb{R}^{n}\) of dimension d, \(\Pr_{\Pi \sim \mathcal{D}}\left[\Pi \text{ is an } \varepsilon\text{-subspace embedding for } E\right] \geq 1 - \delta\).
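The key point of obliviousness is that Π is drawn without knowledge of E. The sketch below (an illustration, not taken from the original entry) draws one sample from a sparse OSE distribution in the style of Clarkson and Woodruff's CountSketch-based embedding (see the recommended reading): each column of Π has a single ±1 in a uniformly random row, so ΠA can be computed in time proportional to the number of nonzeros of A. The dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 1000, 5, 400

# One draw from a sparse OSE distribution: each column of Pi has a single
# +/-1 entry in a random row (CountSketch-style sparse embedding).
rows = rng.integers(0, m, size=n)
signs = rng.choice([-1.0, 1.0], size=n)
Pi = np.zeros((m, n))
Pi[rows, np.arange(n)] = signs

# Obliviousness: Pi was drawn with no knowledge of the subspace. Pick a
# random d-dimensional subspace afterwards and inspect the distortion.
U, _ = np.linalg.qr(rng.standard_normal((n, d)))
s = np.linalg.svd(Pi @ U, compute_uv=False)
print(s.min(), s.max())  # concentrated around 1 for m large enough
```

For this construction, m = O(d²∕ε²) rows suffice, a worse dependence on d than dense Gaussian matrices but with an extremely fast, sparsity-preserving application.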
Recommended Reading
Ailon N, Chazelle B (2009) The fast Johnson–Lindenstrauss transform and approximate nearest neighbors. SIAM J Comput 39(1):302–322
Andoni A, Nguyễn HL (2013) Eigenvalues of a matrix in the streaming model. In: Proceedings of the 24th annual ACM-SIAM symposium on discrete algorithms (SODA), New Orleans, pp 1729–1737
Avron H, Boutsidis C, Toledo S, Zouzias A (2013) Efficient dimensionality reduction for canonical correlation analysis. In: Proceedings of the 30th international conference on machine learning (ICML), Atlanta, pp 347–355
Boutsidis C, Woodruff DP (2014) Optimal CUR matrix decompositions. In: Proceedings of the 46th ACM symposium on theory of computing (STOC), New York, pp 353–362
Bourgain J, Nelson J (2013) Toward a unified theory of sparse dimensionality reduction in Euclidean space. CoRR abs/1311.2542
Boutsidis C, Zouzias A, Mahoney MW, Drineas P (2011) Stochastic dimensionality reduction for k-means clustering. CoRR abs/1110.2897
Clarkson KL, Woodruff DP (2013) Low rank approximation and regression in input sparsity time. In: Proceedings of the 45th ACM symposium on theory of computing (STOC), Palo Alto, pp 81–90
Clarkson KL, Woodruff DP (2009) Numerical linear algebra in the streaming model. In: Proceedings of the 41st ACM symposium on theory of computing (STOC), Bethesda, pp 205–214
Demmel J, Dumitriu I, Holtz O (2007) Fast linear algebra is stable. Numer Math 108(1):59–91
Drineas P, Mahoney MW, Muthukrishnan S (2006) Subspace sampling and relative-error matrix approximation: column-based methods. In: Proceedings of the 10th international workshop on randomization and computation (RANDOM), Barcelona, pp 316–326
Drineas P, Magdon-Ismail M, Mahoney MW, Woodruff DP (2012) Fast approximation of matrix coherence and statistical leverage. J Mach Learn Res 13:3475–3506
Kane DM, Nelson J (2014) Sparser Johnson-Lindenstrauss transforms. J ACM 61(1):4
Kannan R, Vempala S, Woodruff DP (2014) Principal component analysis and higher correlations for distributed data. In: Proceedings of the 27th annual conference on learning theory (COLT), Barcelona, pp 1040–1057
Lu Y, Dhillon P, Foster D, Ungar L (2013) Faster ridge regression via the subsampled randomized Hadamard transform. In: Proceedings of the 26th annual conference on advances in neural information processing systems (NIPS), Lake Tahoe
Mahoney MW, Meng X (2013) Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression. In: Proceedings of the 45th ACM symposium on theory of computing (STOC), Palo Alto, pp 91–100
Nelson J, Nguyễn HL (2013) OSNAP: faster numerical linear algebra algorithms via sparser subspace embeddings. In: Proceedings of the 54th annual IEEE symposium on foundations of computer science (FOCS), Berkeley, pp 117–126
Nelson J, Nguyễn HL (2014) Lower bounds for oblivious subspace embeddings. In: Proceedings of the 41st international colloquium on automata, languages, and programming (ICALP), Copenhagen, pp 883–894
Paul S, Boutsidis C, Magdon-Ismail M, Drineas P (2013) Random projections for support vector machines. In: Proceedings of the sixteenth international conference on artificial intelligence and statistics (AISTATS), Scottsdale, pp 498–506
Sarlós T (2006) Improved approximation algorithms for large matrices via random projections. In: Proceedings of the 47th annual IEEE symposium on foundations of computer science (FOCS), Berkeley, pp 143–152
Thorup M, Zhang Y (2012) Tabulation-based 5-independent hashing with applications to linear probing and second moment estimation. SIAM J Comput 41(2):293–331
Tropp JA (2011) Improved analysis of the subsampled randomized Hadamard transform. Adv Adapt Data Anal 3(1–2):115–126
Woodruff DP, Zhang Q (2013) Subspace embeddings and ℓ p -regression using exponential random variables. In: Proceedings of the 26th annual conference on learning theory (COLT), Princeton, pp 546–567
Copyright information
© 2016 Springer Science+Business Media New York
Cite this entry
Nelson, J. (2016). Oblivious Subspace Embeddings. In: Kao, MY. (eds) Encyclopedia of Algorithms. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-2864-4_795
Print ISBN: 978-1-4939-2863-7
Online ISBN: 978-1-4939-2864-4