Including Performance Benchmarks into Continuous Integration to Enable DevOps

Published: 03 April 2015

Abstract

The DevOps movement aims to improve communication, collaboration, and integration between software developers (Dev) and IT operations professionals (Ops). Automation of software quality assurance is key to DevOps success. We present how automated performance benchmarks can be included in continuous integration. As an example, we report on regression benchmarks for application monitoring frameworks and illustrate the inclusion of automated benchmarks in continuous integration setups.
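
The central mechanism in the abstract, running a performance benchmark on every build and failing the build when a regression appears, can be sketched as a small gate program that a CI server such as Jenkins invokes after the benchmark job. The Java sketch below is illustrative, not the authors' implementation: it assumes the benchmark run (e.g., MooBench) writes its mean response time in nanoseconds as a single number to results/current.txt and that a stored baseline sits in results/baseline.txt; the file names, the single-number format, and the 10% threshold are assumptions for the example.

    // Hypothetical CI regression gate; file locations, result format,
    // and threshold are illustrative assumptions, not part of the paper.
    import java.nio.file.Files;
    import java.nio.file.Path;

    public class RegressionGate {
        // Fail the build if the current run is more than 10% slower than baseline.
        private static final double THRESHOLD = 1.10;

        public static void main(String[] args) throws Exception {
            double baseline = read("results/baseline.txt");
            double current = read("results/current.txt");
            System.out.printf("baseline=%.1f ns, current=%.1f ns%n", baseline, current);
            if (current > baseline * THRESHOLD) {
                System.err.println("Performance regression detected; failing the build.");
                System.exit(1); // non-zero exit marks the CI job as failed
            }
        }

        // Parse a single benchmark result (mean response time in ns) from a file.
        private static double read(String path) throws Exception {
            return Double.parseDouble(Files.readString(Path.of(path)).trim());
        }
    }

Run as a post-benchmark build step, the non-zero exit status marks the Jenkins build as failed, so a monitoring-overhead regression becomes visible to developers before the change moves on toward operations.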

References

[1]
J. Ehlers, A. van Hoorn, J. Waller, and W. Hasselbring. Selfadaptive software system monitoring for performance anomaly localization. In Proceedings of the 8th IEEE/ACM International Conference on Autonomic Computing (ICAC 2011), pages 197--200. ACM, June 2011.
[2]
T. Kalibera, J. Lehotsky, D. Majda, B. Repcek, M. Tomcanyi, A. Tomecek, P. T?uma, and J. Urban. Automated benchmarking and analysis tool. In Proceedings of the 1st International Conference on Performance Evaluation Methodologies and Tools (Valuetools 2006), pages 5--14. ACM, Oct. 2006.
[3]
M. Meyer. Continuous integration and its tools. IEEE Software, 31(3):14--16, May 2014.
[4]
S. E. Sim, S. Easterbrook, and R. C. Holt. Using benchmarking to advance research: A challenge to software engineering. In Proceedings of the 25th International Conference on Software Engineering (ICSE 2003), pages 74--83. IEEE Computer Society, May 2003.
[5]
A. van Hoorn, J. Waller, and W. Hasselbring. Kieker: A framework for application performance monitoring and dynamic software analysis. In Proceedings of the 3rd ACM/SPEC International Conference on Performance Engineering (ICPE 2012), pages 247--248. ACM, Apr. 2012.
[6]
J. Waller and W. Hasselbring. A benchmark engineering methodology to measure the overhead of application-level monitoring. In Proceedings of the Symposium on Software Performance: Joint Kieker/Palladio Days (KPDays 2013), pages 59--68. CEUR Workshop Proceedings, Nov. 2013.
[7]
C. Weiss, D. Westermann, C. Heger, and M. Moser. Systematic performance evaluation based on tailored benchmark applications. In Proceedings of the 4th ACM/SPEC International Conference on Performance Engineering (ICPE 2013), pages 411--420. ACM, Apr. 2013.

Published In

ACM SIGSOFT Software Engineering Notes, Volume 40, Issue 2
March 2015
125 pages
ISSN: 0163-5948
DOI: 10.1145/2735399

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 03 April 2015
Published in SIGSOFT Volume 40, Issue 2

Author Tags

  1. Jenkins
  2. Kieker
  3. MooBench

Qualifiers

  • Research-article

Cited By

  • (2024) Deployment and Analysis of Microservice-Based Application in Different Cloud Environment. SSRN Electronic Journal. DOI: 10.2139/ssrn.4492378. Online publication date: 2024.
  • (2024) Assessing BizDevOps maturity using international standards. Journal of Software: Evolution and Process, 36(8). DOI: 10.1002/smr.2646. Online publication date: 5 Aug 2024.
  • (2023) Towards Solving the Challenge of Minimal Overhead Monitoring. Companion of the 2023 ACM/SPEC International Conference on Performance Engineering, pages 381-388. DOI: 10.1145/3578245.3584851. Online publication date: 15 Apr 2023.
  • (2023) Automated Detection of Software Performance Antipatterns in Java-Based Applications. IEEE Transactions on Software Engineering, 49(4):2873-2891. DOI: 10.1109/TSE.2023.3234321. Online publication date: 1 Apr 2023.
  • (2023) An HPC-Container Based Continuous Integration Tool for Detecting Scaling and Performance Issues in HPC Applications. IEEE Transactions on Services Computing, pages 1-12. DOI: 10.1109/TSC.2023.3337662. Online publication date: 2023.
  • (2022) Decision-Making Taxonomy of DevOps Success Factors Using Preference Ranking Organization Method of Enrichment Evaluation. Mathematical Problems in Engineering, 2022:1-15. DOI: 10.1155/2022/2600160. Online publication date: 10 Jan 2022.
  • (2022) Using Microbenchmark Suites to Detect Application Performance Changes. IEEE Transactions on Cloud Computing, pages 1-18. DOI: 10.1109/TCC.2022.3217947. Online publication date: 2022.
  • (2022) Automated Identification of Performance Changes at Code Level. 2022 IEEE 22nd International Conference on Software Quality, Reliability and Security (QRS), pages 916-925. DOI: 10.1109/QRS57517.2022.00096. Online publication date: Dec 2022.
  • (2022) Making the Cloud Monitor Real-Time Adaptive. 2022 IEEE Cloud Summit, pages 69-74. DOI: 10.1109/CloudSummit54781.2022.00017. Online publication date: Oct 2022.
  • (2021) Using application benchmark call graphs to quantify and improve the practical relevance of microbenchmark suites. PeerJ Computer Science, 7:e548. DOI: 10.7717/peerj-cs.548. Online publication date: 28 May 2021.
