
Multi-level privacy analysis of business processes: the Pleak toolset

Published in: International Journal on Software Tools for Technology Transfer

Abstract

Privacy regulations, such as the GDPR, impose strict requirements on organizations that store and process private data. Privacy-enhancing technologies (PETs), such as secure multi-party computation and differential privacy, provide mechanisms to perform computations over private data and to limit the disclosure of private data and derivatives thereof. When PETs are used to protect individual computations or disclosures, their privacy properties and their effect on the utility of the disclosed data can be straightforwardly asserted. However, when multiple PETs are used as part of a complex, and possibly inter-organizational, business process, it becomes non-trivial for analysts to fully grasp the guarantees that the combined set of PETs provides overall. This article presents a multi-level approach to analyzing the privacy properties of business processes that rely on PETs to protect private data. The approach is embodied in an open-source toolset, Pleak, which allows analysts to capture privacy-enhanced business process models and to characterize and quantify to what extent the outputs of a process leak information about its inputs. Pleak incorporates an extensible set of analysis plugins, which enable users to inspect potential leakages at multiple levels of detail.
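To make the abstract's mention of differential privacy concrete, the Laplace mechanism, one of the PETs the abstract refers to, can be sketched in a few lines. This is a generic textbook sketch, not Pleak's implementation; the function name and parameters are illustrative.

```python
import math
import random

def laplace_mechanism(true_value: float, sensitivity: float, epsilon: float) -> float:
    """Release true_value with Laplace noise of scale sensitivity/epsilon.

    For a query whose L1 sensitivity (max change from adding or removing
    one record) is `sensitivity`, this yields epsilon-differential privacy.
    """
    scale = sensitivity / epsilon
    # Inverse-CDF sampling of Laplace(0, scale) from U ~ Uniform(-0.5, 0.5).
    u = random.random() - 0.5
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_value + noise

# A counting query has sensitivity 1: one record changes the count by at most 1.
noisy_count = laplace_mechanism(42, sensitivity=1.0, epsilon=0.5)
```

A smaller epsilon means more noise and hence stronger privacy at the cost of utility, which is exactly the trade-off Pleak's quantitative analyses help an analyst reason about.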


Figures 1–6 (not included in this version).


Notes

  1. We use the term disclosure to refer to some data being made accessible to a party, without implying that this party should or should not have access to these data. On the other hand, we use the term leakage to refer to a disclosure that is likely to be unintentional or forbidden, as it is the result of a misuse of some privacy-enhancing technology, or it violates a policy.

  2. https://pleak.io (account: demo@example.com, password: pleakdemo, manual: https://pleak.io/wiki/, source code: https://github.com/pleak-tools/)

  3. For a brief overview of BPMN and its salient features, see the OMG introduction to BPMN at https://www.omg.org/bpmn/Documents/Introduction_to_BPMN.pdf

  4. Available in Pleak as https://pleak.io/app/#/view/HHQEWDWiweWX2HOmTGU4

  5. This structure is also reflected in our source code in https://github.com/pleak-tools

  6. The Pleak toolset allows users to display these two types of markers in separate reports (a disclosure report and a dependency report), or in a combined report.

  7. https://www.mcrl2.org/.

  8. Available in Pleak as https://pleak.io/app/#/view/5biNWV9UN72HBP6W8CeH

  9. The other four processes are related to: (i) information sharing between countries during a pandemic scenario; (ii) collection and processing of mobile device data during an emergency situation; (iii) collection and processing of room occupancy data in a building; and (iv) privacy-enhanced scheduling of meetings across multiple organizations. The corresponding process models can be found in the cloud deployment of the Pleak toolset available at https://pleak.io (account: demo@example.com, password: pleakdemo).



Author information


Corresponding author

Correspondence to Raimundas Matulevičius.


Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This research was funded by the Air Force Research laboratory (AFRL) and Defense Advanced Research Projects Agency (DARPA) under contract FA8750-16-C-0011. The views expressed are those of the author(s) and do not reflect the official policy or position of the Department of Defense or the U.S. Government. This research has been also supported by the Estonian Personal Research Grant Number 920.

SQL scripts for the example model

This section gives all the SQL specifications of the example model. The same scripts can be seen when opening the model in Pleak at https://pleak.io/app/#/view/HHQEWDWiweWX2HOmTGU4.

1.1 Input data

1.1.1 Port

figure o

For the guessing advantage and sensitivity analysis, this table has the following constraints:

figure p

1.1.2 Ship

figure q

For the guessing advantage and sensitivity analysis, this table has the following constraints:

figure r

1.1.3 Berth

figure s

1.1.4 Slot

figure t

For the guessing advantage and sensitivity analysis, this table has the following constraints:

figure u
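The input-data scripts themselves appear in this version only as figure placeholders. As a rough, hypothetical sketch of what the four input tables could look like, exercised through Python's sqlite3 module: all table names follow the section headings above, but every column name and type below is an assumption, not the actual Pleak model.

```python
import sqlite3

# In-memory database standing in for the ship-to-port scenario's inputs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Hypothetical schemas; column names/types are illustrative assumptions.
CREATE TABLE port  (port_id INTEGER PRIMARY KEY, name TEXT,
                    latitude REAL, longitude REAL, available INTEGER);
CREATE TABLE ship  (ship_id INTEGER PRIMARY KEY, name TEXT,
                    latitude REAL, longitude REAL,
                    max_speed REAL, cargo INTEGER, draft REAL);
CREATE TABLE berth (port_id INTEGER, berth_id INTEGER, berth_length REAL);
CREATE TABLE slot  (port_id INTEGER, berth_id INTEGER, slot_id INTEGER,
                    slot_start INTEGER, slot_end INTEGER);
""")
```

In the actual model, the guessing-advantage and sensitivity analyses additionally attach numeric-range constraints to the port, ship, and slot tables (shown as figure placeholders above).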

1.2 Data processing tasks

1.2.1 Count the number of reachable ships for each port

figure v
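The query for this task is likewise shown only as a figure placeholder. A hypothetical sketch of a per-port reachable-ship count follows; the reachable table and all names are illustrative assumptions, not the model's actual script.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE port (port_id INTEGER PRIMARY KEY, name TEXT);
-- reachable(port_id, ship_id): ships that can arrive at the port in time.
CREATE TABLE reachable (port_id INTEGER, ship_id INTEGER);
INSERT INTO port VALUES (1, 'P1'), (2, 'P2');
INSERT INTO reachable VALUES (1, 1), (1, 2), (2, 1);
""")
# LEFT JOIN so that ports with no reachable ships still get a count of 0.
rows = conn.execute("""
    SELECT p.port_id, COUNT(r.ship_id) AS reachable_ships
    FROM port p LEFT JOIN reachable r ON r.port_id = p.port_id
    GROUP BY p.port_id
    ORDER BY p.port_id
""").fetchall()
# rows == [(1, 2), (2, 1)]
```

It is aggregate counts of exactly this kind that the differential-privacy analyses would protect with calibrated noise.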

1.2.2 Compute the number of ships that can fit to the port

figure w

1.2.3 Compute the number of ships that should go to the port

figure x

1.2.4 Normalize ships order

figure y

1.2.5 Select ships according to capacities

figure z
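This final selection script is also a figure placeholder. A hypothetical sketch of capacity-based selection, assuming the normalized per-port ship order from task 1.2.4 and a per-port slot count; all table and column names are illustrative, not the model's actual SQL.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- ord = 1-based position of the ship in the port's normalized order.
CREATE TABLE ship_order    (port_id INTEGER, ship_id INTEGER, ord INTEGER);
CREATE TABLE port_capacity (port_id INTEGER, num_slots INTEGER);
INSERT INTO ship_order VALUES (1, 10, 1), (1, 11, 2), (1, 12, 3);
INSERT INTO port_capacity VALUES (1, 2);
""")
# Keep, per port, only as many ships as the port has slots for.
selected = conn.execute("""
    SELECT s.port_id, s.ship_id
    FROM ship_order s JOIN port_capacity c ON c.port_id = s.port_id
    WHERE s.ord <= c.num_slots
    ORDER BY s.port_id, s.ord
""").fetchall()
# selected == [(1, 10), (1, 11)]
```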


About this article


Cite this article

Dumas, M., García-Bañuelos, L., Jääger, J. et al. Multi-level privacy analysis of business processes: the Pleak toolset. Int J Softw Tools Technol Transfer 24, 183–203 (2022). https://doi.org/10.1007/s10009-021-00636-w

