ABSTRACT
We consider the problem of privately releasing a low-dimensional approximation to a set of data records, represented as a matrix A in which each row corresponds to an individual and each column to an attribute. Our goal is to compute a subspace that captures the covariance of A as much as possible, classically known as principal component analysis (PCA). We assume that each row of A has ℓ2 norm bounded by one, and the privacy guarantee is defined with respect to the addition or removal of any single row. We show that the well-known, but misnamed, randomized response algorithm, with properly tuned parameters, provides a nearly optimal additive quality gap compared to the best possible singular subspace of A. We further show that when AᵀA has a large eigenvalue gap -- a reason often cited for applying PCA -- the quality improves significantly. Optimality (up to logarithmic factors) is proved using techniques inspired by the recent work of Bun, Ullman, and Vadhan on applying Tardos's fingerprinting codes to the construction of hard instances for private mechanisms for 1-way marginal queries. Along the way we define a list-culling game which may be of independent interest.
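The mechanism described above is simple to state: add a symmetric Gaussian noise matrix to AᵀA and take the top singular subspace of the result. A minimal sketch in Python, assuming rows of ℓ2 norm at most one (the noise calibration below is the standard Gaussian-mechanism choice, used here for illustration, not the paper's exact constants):

```python
import numpy as np

def analyze_gauss_pca(A, k, eps, delta, rng=None):
    """Sketch of Gaussian-noise private PCA.

    Assumes each row of A has l2 norm at most 1, so adding or removing
    one row changes A^T A by at most 1 in Frobenius norm.
    """
    rng = np.random.default_rng() if rng is None else rng
    d = A.shape[1]
    # Noise scale from the standard Gaussian mechanism (illustrative choice).
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    # Symmetric Gaussian noise: sample the upper triangle, mirror it.
    E = rng.normal(0.0, sigma, size=(d, d))
    E = np.triu(E) + np.triu(E, 1).T
    C = A.T @ A + E  # private covariance estimate
    # Top-k eigenvectors of the noisy covariance span the released subspace.
    w, V = np.linalg.eigh(C)
    return V[:, np.argsort(w)[::-1][:k]]  # d x k orthonormal basis
```

Because the noise is added once to AᵀA, the released subspace can be post-processed freely without further privacy loss.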
By combining the randomized response mechanism with the well-known follow-the-perturbed-leader algorithm of Kalai and Vempala, we obtain a private online algorithm with nearly optimal regret. The regret of our algorithm even outperforms that of all previously known online non-private algorithms of this type. We achieve this better bound by, satisfyingly, borrowing insights and tools from differential privacy!
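The online combination can be sketched in the same spirit: at each round, perturb the cumulative covariance of past inputs with fresh symmetric Gaussian noise and play its leading eigenvector. The interface and noise scale below are our own illustrative assumptions, not the paper's algorithm verbatim:

```python
import numpy as np

def ftpl_online_pca(stream, d, eps, delta, rng=None):
    """Illustrative follow-the-perturbed-leader for online rank-1 PCA.

    For each round t, yields the top eigenvector of the cumulative
    covariance of rounds 1..t-1 plus fresh symmetric Gaussian noise;
    the vector x_t is folded in only after the prediction is emitted.
    """
    rng = np.random.default_rng() if rng is None else rng
    sigma = np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    C = np.zeros((d, d))
    for x in stream:
        N = rng.normal(0.0, sigma, size=(d, d))
        N = np.triu(N) + np.triu(N, 1).T   # symmetric perturbation
        w, V = np.linalg.eigh(C + N)
        yield V[:, np.argmax(w)]           # prediction for this round
        C += np.outer(x, x)                # then incorporate x_t
```

The fresh perturbation each round is exactly the Gaussian-noise stabilization that makes the leader both private and low-regret.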
REFERENCES
- Dimitris Achlioptas and Frank McSherry. Fast computation of low-rank matrix approximations. Journal of the ACM, 54, 2007.
- Peter N. Belhumeur, João P. Hespanha, and David Kriegman. Eigenfaces vs. Fisherfaces: Recognition using class specific linear projection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 19(7):711--720, 1997.
- J. L. Bentley and J. B. Saxe. Decomposable searching problems I: Static-to-dynamic transformation. Journal of Algorithms, 1, 1980.
- Avrim Blum, Cynthia Dwork, Frank McSherry, and Kobbi Nissim. Practical privacy: The SuLQ framework. In PODS, 2005.
- Dan Boneh and James Shaw. Collusion-secure fingerprinting for digital data. IEEE Transactions on Information Theory, 44:1897--1905, 1998.
- Mark Bun, Jonathan Ullman, and Salil Vadhan. Fingerprinting codes and the price of approximate differential privacy. In these proceedings.
- Emmanuel J. Candès and Benjamin Recht. Exact matrix completion via convex optimization. Foundations of Computational Mathematics, 2009.
- Emmanuel J. Candès and Terence Tao. The power of convex relaxation: Near-optimal matrix completion. IEEE Transactions on Information Theory, 2010.
- Kamalika Chaudhuri, Anand D. Sarwate, and Kaushik Sinha. Near-optimal differentially private principal components. In NIPS, 2012.
- Chandler Davis and William Morton Kahan. The rotation of eigenvectors by a perturbation. III. SIAM Journal on Numerical Analysis, 1970.
- Cynthia Dwork and Aaron Roth. The Algorithmic Foundations of Differential Privacy. Monograph in preparation, 2014.
- Cynthia Dwork, Krishnaram Kenthapadi, Frank McSherry, Ilya Mironov, and Moni Naor. Our data, ourselves: Privacy via distributed noise generation. In EUROCRYPT, pages 486--503, 2006.
- Cynthia Dwork and Jing Lei. Differential privacy and robust statistics. In STOC, pages 371--380, 2009.
- Cynthia Dwork, Frank McSherry, Kobbi Nissim, and Adam Smith. Calibrating noise to sensitivity in private data analysis. In Theory of Cryptography Conference, pages 265--284. Springer, 2006.
- Cynthia Dwork, Moni Naor, Toniann Pitassi, and Guy N. Rothblum. Differential privacy under continual observation. In STOC, pages 715--724. ACM, 2010.
- Cynthia Dwork, Moni Naor, Omer Reingold, Guy N. Rothblum, and Salil Vadhan. On the complexity of differentially private data release: Efficient algorithms and hardness results. In STOC, pages 381--390, 2009.
- Moritz Hardt and Aaron Roth. Beating randomized response on incoherent matrices. In STOC, 2012.
- Moritz Hardt and Aaron Roth. Beyond worst-case analysis in private singular vector computation. In STOC, 2013.
- Elad Hazan, Satyen Kale, and Manfred K. Warmuth. Corrigendum to "Learning rotations with little regret", September 7, 2010.
- Adam Kalai and Santosh Vempala. Efficient algorithms for online decision problems. Journal of Computer and System Sciences, 2005.
- Michael Kapralov and Kunal Talwar. On differentially private low rank approximation. In SODA, 2013.
- Jyrki Kivinen and Manfred K. Warmuth. Exponentiated gradient versus gradient descent for linear predictors. Information and Computation, 132(1):1--63, 1997.
- Frank McSherry and Ilya Mironov. Differentially private recommender systems: Building privacy into the Netflix Prize contenders. In KDD, pages 627--636. ACM, 2009.
- Frank McSherry. Spectral methods for data analysis. 2004.
- Mehryar Mohri and Ameet Talwalkar. Can matrix coherence be efficiently and accurately estimated? In AISTATS, 2011.
- Benjamin Recht. A simpler approach to matrix completion. Journal of Machine Learning Research, 2011.
- Shai Shalev-Shwartz. Online learning and online convex optimization. Foundations and Trends in Machine Learning, 2011.
- Adam Smith and Abhradeep Thakurta. Personal communication, 2013.
- Ameet Talwalkar and Afshin Rostamizadeh. Matrix coherence and the Nyström method. arXiv preprint arXiv:1004.2008, 2010.
- Terence Tao. Topics in Random Matrix Theory, volume 132. American Mathematical Society, 2012.
- Gábor Tardos. Optimal probabilistic fingerprint codes. Journal of the ACM, 55(2), 2008.
- Manfred K. Warmuth and Dima Kuzmin. Randomized online PCA algorithms with regret bounds that are logarithmic in the dimension. Journal of Machine Learning Research, 2008.
- John Wright, Allen Y. Yang, Arvind Ganesh, Shankar S. Sastry, and Yi Ma. Robust face recognition via sparse representation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 31(2):210--227, 2009.
Index Terms
- Analyze Gauss: optimal bounds for privacy-preserving principal component analysis