ABSTRACT
We propose truncated concentrated differential privacy (tCDP), a refinement of differential privacy and of concentrated differential privacy. The new definition provides robust and efficient composition guarantees, supports powerful algorithmic techniques such as privacy amplification via sub-sampling, and enables more accurate statistical analyses. In particular, we exhibit a central task for which the new definition enables an exponential accuracy improvement.
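As a rough illustration of the sub-sampling technique mentioned above, the sketch below runs a noise-adding mechanism on a random subsample of the data rather than on the full dataset; because any single record appears in the subsample only with probability `sample_rate`, the privacy guarantee of the inner mechanism is amplified. This is a generic, hedged sketch only: the function name, the choice of a Gaussian-noise inner mechanism, and all parameters are illustrative assumptions, not the paper's construction.

```python
import random

def subsampled_gaussian_mean(data, sample_rate, sigma, rng=None):
    """Release a noisy mean of a random subsample of `data`.

    Each record is kept independently with probability `sample_rate`;
    Gaussian noise of scale `sigma` is then added to the subsample mean.
    Sub-sampling amplifies the privacy guarantee of the inner mechanism,
    since any given record influences the output only when sampled.
    """
    rng = rng or random.Random()
    # Poisson subsampling: keep each record independently.
    sample = [x for x in data if rng.random() < sample_rate]
    if not sample:
        # Degenerate case: release pure noise around 0.
        return rng.gauss(0.0, sigma)
    true_mean = sum(sample) / len(sample)
    return true_mean + rng.gauss(0.0, sigma)
```

The exact amplification factor one obtains depends on the privacy definition used to account for the inner mechanism, which is precisely the kind of bookkeeping the tCDP definition is designed to support.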