DOI: 10.1145/3302541.3310294
Research article

Performance Benchmarking of Infrastructure-as-a-Service (IaaS) Clouds with Cloud WorkBench

Published: 27 March 2019

Abstract

The continuing growth of the cloud computing market has led to an unprecedented diversity of cloud services with different performance characteristics. To support service selection, researchers and practitioners conduct cloud performance benchmarking by measuring and objectively comparing the performance of different providers and configurations (e.g., instance types in different data center regions). In this tutorial, we demonstrate how to write performance tests for IaaS clouds using the Web-based benchmarking tool Cloud WorkBench (CWB). We will motivate and introduce benchmarking of IaaS clouds in general, demonstrate the execution of a simple benchmark in a public cloud environment, summarize the CWB tool architecture, and interactively develop and deploy a more advanced benchmark together with the participants.
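
To make the abstract concrete, the sketch below illustrates in Python the kind of self-contained micro-benchmark that a single CWB execution might run on a freshly provisioned cloud instance. It is a minimal, purely illustrative example: the function names and the reporting format are hypothetical and do not reflect CWB's actual benchmark interface, which the tutorial itself introduces.

# Hypothetical stand-in for the kind of micro-benchmark one CWB execution might run
# on a provisioned IaaS instance. All names are illustrative, not part of CWB's API.
import time


def cpu_workload(iterations: int = 2_000_000) -> float:
    """Run a deterministic CPU-bound loop and return the elapsed wall-clock seconds."""
    start = time.perf_counter()
    acc = 0.0
    for i in range(1, iterations):
        acc += (i % 7) ** 0.5
    return time.perf_counter() - start


def run_benchmark(repetitions: int = 5) -> dict:
    """Repeat the workload and summarize the samples, mimicking one benchmark execution."""
    samples = [cpu_workload() for _ in range(repetitions)]
    return {
        "metric": "cpu_workload_seconds",
        "samples": samples,
        "mean": sum(samples) / len(samples),
        "min": min(samples),
        "max": max(samples),
    }


if __name__ == "__main__":
    result = run_benchmark()
    # In an actual CWB experiment, a metric like this would be reported back to the
    # CWB server as the result of one scheduled execution on a given provider and
    # instance type; here we simply print it.
    print(result)

Repeating the workload several times within one execution is deliberate: performance in IaaS clouds varies over time and across nominally identical instances, and capturing that variability across providers and configurations is exactly what systematic cloud benchmarking aims to do.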


Published In

ICPE '19: Companion of the 2019 ACM/SPEC International Conference on Performance Engineering
March 2019
99 pages
ISBN: 9781450362863
DOI: 10.1145/3302541
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. benchmarking
  2. cloud computing
  3. performance

Qualifiers

  • Research-article

Conference

ICPE '19

Acceptance Rates

Overall acceptance rate: 252 of 851 submissions (30%)

