
Towards a Resource Elasticity Benchmark for Cloud Environments

Published: 22 March 2014

Abstract

Auto-scaling features offered by today's cloud infrastructures provide increased flexibility, especially for customers who experience high variations in load intensity over time. However, auto-scaling introduces new system quality attributes concerning the accuracy, timing, and boundaries of the scaling behavior. Distinguishing between different offerings has therefore become a complex task, as it is not yet supported by reliable metrics and measurement approaches. In this paper, we discuss shortcomings of existing approaches for measuring and evaluating elastic behavior and propose a novel benchmark methodology specifically designed for evaluating the elasticity aspects of modern cloud platforms. The benchmark is based on open workloads with realistic load variation profiles that are calibrated to induce identical resource demand variations independent of the underlying hardware performance. Furthermore, we propose new metrics that explicitly capture the accuracy of resource allocations and de-allocations, as well as the timing aspects of an auto-scaling mechanism.
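
To illustrate the kind of accuracy metrics described above, the following minimal Python sketch compares a sampled resource demand curve with the resource supply actually provisioned by an auto-scaler and derives average over- and under-provisioning as well as the share of time spent in each state. The function name elasticity_accuracy, the metric names, and the piecewise-constant sampling are assumptions made for illustration only; they do not reproduce the paper's exact metric definitions.

# Illustrative sketch (not the paper's definitions): given sampled resource
# demand d(t) and resource supply s(t) over an experiment, compute average
# over-/under-provisioning and the fraction of time spent in each state.
def elasticity_accuracy(timestamps, demand, supply):
    """Accuracy metrics from piecewise-constant demand/supply samples."""
    assert len(timestamps) == len(demand) == len(supply) >= 2
    total_time = timestamps[-1] - timestamps[0]
    over_area = under_area = 0.0   # resource-time units above / below demand
    time_over = time_under = 0.0   # time spent over- / under-provisioned
    for i in range(len(timestamps) - 1):
        dt = timestamps[i + 1] - timestamps[i]
        diff = supply[i] - demand[i]
        if diff > 0:
            over_area += diff * dt
            time_over += dt
        elif diff < 0:
            under_area += -diff * dt
            time_under += dt
    return {
        "accuracy_over": over_area / total_time,    # avg surplus resource units
        "accuracy_under": under_area / total_time,  # avg missing resource units
        "time_over_share": time_over / total_time,
        "time_under_share": time_under / total_time,
    }

if __name__ == "__main__":
    # Toy example: demand rises from 2 to 6 units; the auto-scaler reacts late.
    t = [0, 60, 120, 180, 240]   # seconds
    d = [2, 4, 6, 6, 4]          # demanded resource units
    s = [2, 2, 4, 6, 6]          # supplied resource units
    print(elasticity_accuracy(t, d, s))

In this sketch the accuracy values are normalized by the experiment duration, so they can be read as the average number of surplus or missing resource units over the measurement period; the time shares indicate how long the auto-scaler deviated from the demand in either direction.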



Published In

HotTopiCS '14: Proceedings of the 2nd International Workshop on Hot Topics in Cloud service Scalability
March 2014
39 pages
ISBN:9781450330596
DOI:10.1145/2649563

In-Cooperation

  • SAP
  • SPEC: SPEC Research Group

Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. Demand
  2. Elasticity
  3. Load Profile
  4. Resource
  5. Supply

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

HotTopiCS '14

Acceptance Rates

Overall Acceptance Rate 10 of 15 submissions, 67%


