MSDBench: Understanding the Performance Impact of Isolation Domains on Microservice-Based IoT Deployments

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13852)

Abstract

We present MSDBench, a set of benchmarks designed to illuminate the effects of deployment choices and operating system abstractions on microservice performance in IoT settings. The microservices architecture has emerged as a mainstay set of design principles for cloud-hosted, network-facing applications, but its utility as a design pattern for the Internet of Things (IoT) is less well understood.

We use MSDBench to show the performance impact of different deployment choices and isolation-domain assignments on Linux and on Ambience, an experimental operating system designed specifically to support microservices for IoT. The results indicate that deployment choices can have a dramatic effect on microservice performance, making MSDBench a useful tool for developers and researchers in this space.
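
To make the kind of measurement concrete, the sketch below shows a minimal client-side latency benchmark that replays the same request workload against a microservice deployed under different isolation domains (bare process, container, VM, and so on) and reports median and tail latency. This is an illustrative sketch only, not MSDBench's actual interface, which is not described here; the endpoint URL, request count, and function names are assumptions.

```python
# Minimal latency-benchmark sketch (illustrative, not MSDBench itself):
# measure round-trip request latency to a microservice endpoint so the same
# workload can be compared across deployment/isolation-domain choices.
import statistics
import time
import urllib.request

ENDPOINT = "http://localhost:8080/echo"  # hypothetical service under test
REQUESTS = 1000                          # assumed workload size


def measure_latencies(url: str, n: int) -> list[float]:
    """Issue n sequential GET requests and return per-request latency in ms."""
    latencies = []
    for _ in range(n):
        start = time.perf_counter()
        with urllib.request.urlopen(url) as resp:
            resp.read()
        latencies.append((time.perf_counter() - start) * 1000.0)
    return latencies


if __name__ == "__main__":
    samples = sorted(measure_latencies(ENDPOINT, REQUESTS))
    print(f"median latency: {statistics.median(samples):.3f} ms")
    print(f"p99 latency:    {samples[int(0.99 * len(samples))]:.3f} ms")
```

Rerunning the same script against each deployment configuration (for example, the service in a native process versus inside a container) isolates the overhead attributable to the isolation domain rather than to the workload itself.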

Author information

Correspondence to Sierra Wang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Wang, S., Bakir, F., Ekaireb, T., Pearson, J., Krintz, C., Wolski, R. (2023). MSDBench: Understanding the Performance Impact of Isolation Domains on Microservice-Based IoT Deployments. In: Gainaru, A., Zhang, C., Luo, C. (eds) Benchmarking, Measuring, and Optimizing. Bench 2022. Lecture Notes in Computer Science, vol 13852. Springer, Cham. https://doi.org/10.1007/978-3-031-31180-2_3

  • DOI: https://doi.org/10.1007/978-3-031-31180-2_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-31179-6

  • Online ISBN: 978-3-031-31180-2

  • eBook Packages: Computer Science, Computer Science (R0)
