ABSTRACT
Differential privacy offers ways to answer statistical queries about sensitive data while providing strong, provable privacy guarantees: the presence or absence of any single individual in the data has only a negligible statistical effect on the query's result. In this talk I will introduce the basics of differential privacy and some of the fundamental mechanisms for building differentially private programs. I will then survey several language-based approaches developed to help programmers certify that their programs are differentially private and to guarantee that they provide accurate answers.
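One of the fundamental mechanisms the abstract alludes to is the Laplace mechanism, which adds noise calibrated to a query's sensitivity. A minimal sketch (the function name and parameters are illustrative, not from the talk):

```python
import math
import random

def laplace_mechanism(true_answer, sensitivity, epsilon):
    """Return a differentially private version of a numeric query answer.

    Adding Laplace noise with scale sensitivity/epsilon yields
    epsilon-differential privacy for a query whose output changes by at
    most `sensitivity` when one individual is added to or removed from
    the dataset.
    """
    scale = sensitivity / epsilon
    # Sample Laplace(0, scale) by inverse transform sampling:
    # X = -scale * sgn(u) * ln(1 - 2|u|), with u uniform on (-0.5, 0.5).
    u = random.random() - 0.5
    sign = 1.0 if u >= 0 else -1.0
    noise = -scale * sign * math.log(1 - 2 * abs(u))
    return true_answer + noise

# Example: a counting query has sensitivity 1, since one person can
# change a count by at most 1.
private_count = laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5)
```

Smaller values of `epsilon` give stronger privacy but noisier (less accurate) answers; quantifying this trade-off is exactly the accuracy guarantee mentioned above.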
Index Terms
- Formal Verification of Differential Privacy