
CloudPerf: A Performance Test Framework for Distributed and Dynamic Multi-Tenant Environments

Published: 17 April 2017 · DOI: 10.1145/3030207.3044530

Abstract

The evolution of cloud computing imposes many challenges on performance testing and requires not only a different approach and methodology for performance evaluation and analysis, but also specialized tools and frameworks to support such work. In traditional performance testing, a single workload was typically run against a static test configuration. The main metrics derived from such experiments included throughput, response times, and system utilization at steady state. While this may have been sufficient in the past, when in many cases a single application ran on dedicated hardware, this approach is no longer suitable for cloud-based deployments. Whether private or public, cloud environments typically host a variety of applications on distributed, shared hardware resources, simultaneously accessed by a large number of tenants running heterogeneous workloads. The number of tenants as well as their activity and resource needs change dynamically over time, and the cloud infrastructure reacts by reallocating existing resources or provisioning new ones. Beyond metrics such as the number of tenants and overall resource utilization, performance testing in the cloud must be able to answer many more questions: How is the quality of service of a tenant impacted by the constantly changing activity of other tenants? How long does the cloud infrastructure take to react to changes in demand, and what is the effect on tenants while it does so? How well are service-level agreements met? What is the resource consumption of individual tenants? How can global application- and system-level performance metrics in a distributed system be correlated with an individual tenant's perceived performance?
In this paper we present CloudPerf, a performance test framework specifically designed for distributed and dynamic multi-tenant environments, capable of answering all of the above questions and more. CloudPerf consists of a distributed harness, a protocol-independent load generator and workload modeling framework, an extensible statistics framework with live monitoring and post-analysis tools, interfaces for cloud deployment operations, and a rich set of both low-level and high-level workloads from different domains.
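To make the measurement problem concrete, the following is a minimal, self-contained Python sketch of the kind of experiment such a framework automates: a closed-loop, multi-tenant load generator whose tenant population grows between phases, reporting per-tenant tail latency so the impact of neighboring tenants becomes visible. All names here (simulated_request, tenant_loop, run_phase) and the synthetic workload are illustrative assumptions, not CloudPerf's actual API.

```python
import random
import statistics
import threading
import time

def simulated_request(tenant_id):
    """Stand-in for a real protocol driver (HTTP, SQL, ...); hypothetical."""
    time.sleep(random.uniform(0.001, 0.01))  # synthetic service time

def tenant_loop(tenant_id, stop_event, latencies):
    """Closed-loop load: one outstanding request per tenant at a time."""
    while not stop_event.is_set():
        start = time.perf_counter()
        simulated_request(tenant_id)
        latencies.append(time.perf_counter() - start)

def run_phase(num_tenants, duration_s):
    """Run num_tenants concurrent tenants; report per-tenant p95 latency."""
    stop = threading.Event()
    results = {t: [] for t in range(num_tenants)}
    threads = [threading.Thread(target=tenant_loop, args=(t, stop, results[t]))
               for t in range(num_tenants)]
    for th in threads:
        th.start()
    time.sleep(duration_s)
    stop.set()
    for th in threads:
        th.join()
    for t, lat in results.items():
        if len(lat) >= 2:
            p95 = statistics.quantiles(lat, n=20)[18]  # 95th percentile
            print(f"tenants={num_tenants} tenant={t} "
                  f"reqs={len(lat)} p95={p95 * 1000:.2f} ms")

# Dynamic multi-tenancy: grow the tenant population across phases and
# observe how each tenant's tail latency shifts as neighbors arrive.
for n in (1, 4, 8):
    run_phase(n, duration_s=2.0)
```

A closed-loop driver like this bounds each tenant to one outstanding request; a full framework of the kind described above would additionally support open arrival processes, per-phase SLA checks, and correlation of per-tenant latencies with system-level statistics.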

Published In

ICPE '17: Proceedings of the 8th ACM/SPEC International Conference on Performance Engineering
April 2017, 450 pages
ISBN: 9781450344043
DOI: 10.1145/3030207

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. cloud
      2. load generation
      3. multi-tenancy
      4. performance testing
      5. statistics collection
      6. workload modeling

      Qualifiers

      • Abstract

Conference

ICPE '17

Acceptance Rates

ICPE '17: 27 of 83 submissions accepted, 33%
Overall: 252 of 851 submissions accepted, 30%

Cited By

• (2023) "A Multi-Criteria Decision Support Model for Cloud Computing Virtual Server Product Selection" (original title in Turkish). Çukurova Üniversitesi Mühendislik Fakültesi Dergisi, 38(4), 939-953. DOI: 10.21605/cukurovaumfd.1410269. Published online: 28 Dec 2023.
• (2022) "Making the Cloud Monitor Real-Time Adaptive". 2022 IEEE Cloud Summit, 69-74. DOI: 10.1109/CloudSummit54781.2022.00017. Published: Oct 2022.
• (2021) "An autonomous performance testing framework using self-adaptive fuzzy reinforcement learning". Software Quality Journal, 30(1), 127-159. DOI: 10.1007/s11219-020-09532-z. Published online: 10 Mar 2021.
• (2020) "Workload Diffusion Modeling for Distributed Applications in Fog/Edge Computing Environments". Proceedings of the ACM/SPEC International Conference on Performance Engineering, 218-229. DOI: 10.1145/3358960.3379135. Published: 20 Apr 2020.
• (2020) "Taxonomy of performance testing tools". Proceedings of the 35th Annual ACM Symposium on Applied Computing, 1997-2004. DOI: 10.1145/3341105.3374006. Published: 29 Mar 2020.
• (2020) "ImageJockey: A Framework for Container Performance Engineering". 2020 IEEE 13th International Conference on Cloud Computing (CLOUD), 238-247. DOI: 10.1109/CLOUD49709.2020.00043. Published: Oct 2020.
• (2019) "A Systematic Review on Cloud Testing". ACM Computing Surveys, 52(5), 1-42. DOI: 10.1145/3331447. Published: 13 Sep 2019.
• (2019) "Benchmark Harness". Encyclopedia of Big Data Technologies, 137-141. DOI: 10.1007/978-3-319-77525-8_134. Published: 20 Feb 2019.
• (2018) "One Size Does Not Fit All". Proceedings of the 2018 ACM/SPEC International Conference on Performance Engineering, 211-222. DOI: 10.1145/3184407.3184418. Published: 30 Mar 2018.
• (2018) "A Declarative Approach for Performance Tests Execution in Continuous Software Development Environments". Proceedings of the 2018 ACM/SPEC International Conference on Performance Engineering, 261-272. DOI: 10.1145/3184407.3184417. Published: 30 Mar 2018.
