Abstract
Static program analysis, once seen primarily as a tool for optimising programs, is now increasingly important as a means to provide quality guarantees about programs. One measure of quality is the extent to which programs respect the privacy of user data. Differential privacy is a rigorous, quantified definition of privacy which guarantees a bound on the loss of privacy due to the release of statistical queries. Among the benefits enjoyed by the definition of differential privacy are compositionality properties that allow differentially private analyses to be built from pieces and combined in various ways. This has led to the development of frameworks for the construction of differentially private program analyses which are private-by-construction. Past frameworks assume that the sensitive data is collected centrally and processed by a trusted curator. However, the main examples of differential privacy applied in practice (for example, Google Chrome's collection of browsing statistics, or Apple's training of predictive messaging in iOS 10) use a purely local mechanism applied at the data source, thus avoiding the collection of sensitive data altogether. While this is a benefit of the local approach, with systems like Apple's, users are required to completely trust that the analysis running on their system has the claimed privacy properties.
In this position paper we outline some key challenges in developing static analyses for analysing differential privacy, and propose novel abstractions for describing the behaviour of probabilistic programs not previously used in static analyses.
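To make the local model concrete: the classic local mechanism is randomized response, in which each user perturbs their own bit before it ever leaves their device, so no trusted curator is needed. The sketch below is illustrative only (it is not from the chapter); the function names and the choice of a single-bit survey are our assumptions, but the truth-reporting probability \(e^\varepsilon /(1+e^\varepsilon )\) is the standard one for an \(\varepsilon \)-locally-differentially-private binary mechanism.

```python
import math
import random

def randomized_response(truth: bool, epsilon: float) -> bool:
    """Locally perturb a single sensitive bit.

    Reports the true value with probability e^eps / (1 + e^eps),
    and the flipped value otherwise. This mechanism satisfies
    eps-local differential privacy: no curator ever sees the raw bit.
    """
    p_truth = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    return truth if random.random() < p_truth else not truth

def debias(reports, epsilon: float) -> float:
    """Aggregate noisy reports into an unbiased estimate of the
    true fraction of 'True' answers in the population."""
    p = math.exp(epsilon) / (1.0 + math.exp(epsilon))
    observed = sum(reports) / len(reports)
    # Invert the expected flipping: observed = p*f + (1-p)*(1-f)
    return (observed + p - 1.0) / (2.0 * p - 1.0)
```

Systems such as RAPPOR [14] build on elaborations of exactly this idea; the static-analysis challenge raised in the paper is to verify, on the user's machine, that code like this really has the privacy parameter it claims.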
Notes
1. For convenience, our definition is a rotation by \(90^\circ \) in the unit square of the region defined by [18], and we restrict to the region above \(y=x\) (their definition, after rotation, is symmetric about \(y=x\)).
References
Apple Press Release: Apple previews iOS 10, the biggest iOS release ever (2016). https://www.apple.com/newsroom/2016/06/apple-previews-ios-10-biggest-ios-release-ever. Accessed 22 July 2017
Cousot, P., Cousot, R.: Abstract interpretation: a unified lattice model for static analysis of programs by construction or approximation of fixpoints. In: Proceedings 4th Annual ACM Symposium on Principles of Programming Languages, pp. 238–252 (1977)
Cousot, P., Cousot, R.: Systematic design of program analysis frameworks. In: Proceedings of the 6th ACM SIGACT-SIGPLAN Symposium on Principles of Programming Languages. POPL 1979, pp. 269–282. ACM, New York (1979). https://doi.org/10.1145/567752.567778
Duchi, J.C., Jordan, M.I., Wainwright, M.J.: Local privacy and statistical minimax rates. In: 2013 51st Annual Allerton Conference on Communication, Control, and Computing (Allerton), pp. 1592–1592, October 2013. https://doi.org/10.1109/Allerton.2013.6736718
Dwork, C.: Differential privacy. In: Bugliesi, M., Preneel, B., Sassone, V., Wegener, I. (eds.) ICALP 2006. LNCS, vol. 4052, pp. 1–12. Springer, Heidelberg (2006). https://doi.org/10.1007/11787006_1
Dwork, C., McSherry, F., Nissim, K., Smith, A.: Calibrating noise to sensitivity in private data analysis. In: Halevi, S., Rabin, T. (eds.) TCC 2006. LNCS, vol. 3876, pp. 265–284. Springer, Heidelberg (2006). https://doi.org/10.1007/11681878_14
Dwork, C., Roth, A.: The algorithmic foundations of differential privacy. Found. Trends Theoret. Comput. Sci. 9, 211–407 (2014). https://doi.org/10.1561/0400000042
Ebadi, H.: Dynamic Enforcement of Differential Privacy. Ph.D. thesis, Chalmers University of Technology, March 2018
Ebadi, H.: The PreTPost Framework (2018). https://github.com/ebadi/preTpost
Ebadi, H., Sands, D.: PreTPost: a transparent, user verifiable, local differential privacy framework (2018). https://github.com/ebadi/preTpost. Also appears in [8]
Ebadi, H., Sands, D., Schneider, G.: Differential privacy: now it’s getting personal. In: Proceedings of the 42nd Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages. POPL 2015, pp. 69–81. ACM (2015). https://doi.org/10.1145/2676726.2677005
Erlingsson, Ú., Pihur, V., Korolova, A.: RAPPOR: randomized aggregatable privacy-preserving ordinal response. In: CCS. ACM (2014)
Gaboardi, M., Haeberlen, A., Hsu, J., Narayan, A., Pierce, B.C.: Linear dependent types for differential privacy. In: Proceedings of the 40th Annual ACM SIGPLAN-SIGACT Symposium on Principles of Programming Languages. POPL 2013, pp. 357–370. ACM, New York (2013). https://doi.org/10.1145/2429069.2429113
Giacobazzi, R., Ranzato, F.: Optimal domains for disjunctive abstract interpretation. Sci. Comput. Program. 32(1), 177–210 (1998). https://doi.org/10.1016/S0167-6423(97)00034-8. 6th European Symposium on Programming
Haeberlen, A., Pierce, B.C., Narayan, A.: Differential privacy under fire. In: Proceedings of the 20th USENIX Conference on Security. SEC 2011, pp. 33–33. USENIX Association, Berkeley (2011). http://dl.acm.org/citation.cfm?id=2028067.2028100
Hunt, S.: Abstract interpretation of functional languages: from theory to practice. Ph.D. thesis, Imperial College London, UK (1991)
Hunt, S., Sands, D.: Binding time analysis: a new perspective. In: Proceedings of the ACM Symposium on Partial Evaluation and Semantics-Based Program Manipulation (PEPM 1991), pp. 154–164. ACM Press (1991)
Kairouz, P., Oh, S., Viswanath, P.: The composition theorem for differential privacy. IEEE Trans. Inf. Theory 63(6), 4037–4049 (2017)
Kairouz, P., Oh, S., Viswanath, P.: Extremal mechanisms for local differential privacy. J. Mach. Learn. Res. 17(17), 1–51 (2016). http://jmlr.org/papers/v17/15-135.html
Landauer, J., Redmond, T.: A lattice of information. In: CSFW (1993)
Malacaria, P.: Algebraic foundations for information theoretical, probabilistic and guessability measures of information flow. CoRR abs/1101.3453 (2011). http://arxiv.org/abs/1101.3453
McSherry, F.: Privacy integrated queries. In: Proceedings of the 2009 ACM SIGMOD International Conference on Management of Data (SIGMOD). Association for Computing Machinery, Inc., June 2009
Proserpio, D., Goldberg, S., McSherry, F.: Calibrating data to sensitivity in private data analysis: a platform for differentially-private analysis of weighted datasets. Proc. VLDB Endow. 7(8), 637–648 (2014). https://doi.org/10.14778/2732296.2732300
Tang, J., Korolova, A., Bai, X., Wang, X., Wang, X.: Privacy loss in Apple’s implementation of differential privacy on MacOS 10.12. CoRR abs/1709.02753 (2017). http://arxiv.org/abs/1709.02753
Zhang, D., McKenna, R., Kotsogiannis, I., Hay, M., Machanavajjhala, A., Miklau, G.: EKTELO: a framework for defining differentially-private computations. In: Proceedings of the 2018 International Conference on Management of Data, SIGMOD Conference 2018, Houston, TX, USA, 10–15 June 2018, pp. 115–130 (2018). https://doi.org/10.1145/3183713.3196921
Zhang, D., Kifer, D.: LightDP: towards automating differential privacy proofs. In: POPL (2017)
Acknowledgements
This work was partly funded by the Swedish Foundation for Strategic Research (SSF) under the project WebSec, and by the Swedish Research Council (VR).
© 2020 Springer Nature Switzerland AG
Cite this chapter
Hunt, S., Sands, D. (2020). New Program Abstractions for Privacy. In: Di Pierro, A., Malacaria, P., Nagarajan, R. (eds) From Lambda Calculus to Cybersecurity Through Program Analysis. Lecture Notes in Computer Science(), vol 12065. Springer, Cham. https://doi.org/10.1007/978-3-030-41103-9_10
DOI: https://doi.org/10.1007/978-3-030-41103-9_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-41102-2
Online ISBN: 978-3-030-41103-9
eBook Packages: Mathematics and Statistics (R0)