DOI: 10.1145/2508859.2516686

Membership privacy: a unifying framework for privacy definitions

Published: 04 November 2013

Abstract

We introduce a novel privacy framework that we call Membership Privacy. The framework includes positive membership privacy, which prevents the adversary from significantly increasing its ability to conclude that an entity is in the input dataset, and negative membership privacy, which prevents the leaking of non-membership. These notions are parameterized by a family of distributions that captures the adversary's prior knowledge. The power and flexibility of the proposed framework lie in the ability to choose different distribution families to instantiate membership privacy. Many privacy notions in the literature are equivalent to membership privacy with interesting distribution families, including differential privacy, differential identifiability, and differential privacy under sampling. Casting these notions into the framework leads to a deeper understanding of their strengths and weaknesses, as well as of their relationships to each other. The framework also provides a principled approach to developing new privacy notions under which better utility can be achieved than is possible under differential privacy.
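To make the two notions concrete, here is a minimal sketch, in LaTeX, of how positive membership privacy can be stated; the notation (a mechanism $\mathcal{A}$, a dataset $T$ drawn from a distribution $D$ in a family $\mathbb{D}$, an entity $t$, and a privacy parameter $\gamma \ge 1$) is our paraphrase of the framework's style, not the paper's verbatim definition.

% Sketch: gamma-positive membership privacy under a distribution family \mathbb{D}.
% For every D in \mathbb{D}, every entity t, and every set S of possible outputs,
% seeing that the output landed in S must neither inflate the adversary's
% posterior belief in membership by more than a factor of gamma, nor deflate
% its posterior belief in non-membership by more than a factor of gamma:
\[
\Pr_{T \sim D}\!\left[\, t \in T \mid \mathcal{A}(T) \in S \,\right]
  \;\le\; \gamma \cdot \Pr_{T \sim D}\!\left[\, t \in T \,\right]
\quad\text{and}\quad
\Pr_{T \sim D}\!\left[\, t \notin T \mid \mathcal{A}(T) \in S \,\right]
  \;\ge\; \frac{1}{\gamma} \cdot \Pr_{T \sim D}\!\left[\, t \notin T \,\right].
\]

Negative membership privacy is the mirror-image condition, bounding how much the output can increase the adversary's belief that $t$ is absent. The equivalences the abstract mentions then amount to choosing $\mathbb{D}$: for example, a family of mutually independent distributions is the kind of instantiation under which the framework recovers differential privacy, though the exact families should be taken from the full paper.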



Published In

CCS '13: Proceedings of the 2013 ACM SIGSAC conference on Computer & communications security
November 2013
1530 pages
ISBN: 9781450324779
DOI: 10.1145/2508859

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. differential privacy
  2. membership privacy
  3. privacy notions

Qualifiers

  • Research-article

Conference

CCS'13

Acceptance Rates

CCS '13 paper acceptance rate: 105 of 530 submissions, 20%.
Overall acceptance rate: 1,261 of 6,999 submissions, 18%.


Cited By

  • (2025) Privacy Auditing in Differential Private Machine Learning: The Current Trends. Applied Sciences 15(2):647, 10 Jan 2025. DOI: 10.3390/app15020647
  • (2024) Closed-form bounds for DP-SGD against record-level inference attacks. Proceedings of the 33rd USENIX Conference on Security Symposium, pp. 4819-4836, 14 Aug 2024. DOI: 10.5555/3698900.3699170
  • (2024) MIST. Proceedings of the 33rd USENIX Conference on Security Symposium, pp. 2387-2404, 14 Aug 2024. DOI: 10.5555/3698900.3699034
  • (2024) Privacy and Integrity Protection for IoT Multimodal Data Using Machine Learning and Blockchain. ACM Transactions on Multimedia Computing, Communications, and Applications 20(6):1-18, 8 Mar 2024. DOI: 10.1145/3638769
  • (2024) Preserving Node-level Privacy in Graph Neural Networks. 2024 IEEE Symposium on Security and Privacy (SP), pp. 4714-4732, 19 May 2024. DOI: 10.1109/SP54263.2024.00270
  • (2024) Budget Recycling Differential Privacy. 2024 IEEE Symposium on Security and Privacy (SP), pp. 1028-1046, 19 May 2024. DOI: 10.1109/SP54263.2024.00212
  • (2024) Tackling Privacy Concerns in Correlated Big Data: A Comprehensive Review with Machine Learning Insights. 2024 IEEE International Students' Conference on Electrical, Electronics and Computer Science (SCEECS), pp. 1-6, 24 Feb 2024. DOI: 10.1109/SCEECS61402.2024.10482215
  • (2024) Unraveling Attacks to Machine-Learning-Based IoT Systems: A Survey and the Open Libraries Behind Them. IEEE Internet of Things Journal 11(11):19232-19255, 1 Jun 2024. DOI: 10.1109/JIOT.2024.3377730
  • (2024) Privacy enhancing and generalizable deep learning with synthetic data for mediastinal neoplasm diagnosis. npj Digital Medicine 7(1), 20 Oct 2024. DOI: 10.1038/s41746-024-01290-7
  • (2024) Navigating Differential Privacy Constraints in Machine Learning. Future Data and Security Engineering. Big Data, Security and Privacy, Smart City and Industry 4.0 Applications, pp. 16-30, 27 Nov 2024. DOI: 10.1007/978-981-96-0437-1_2
