DOI: 10.1145/3185768.3186404

How to Detect Performance Changes in Software History: Performance Analysis of Software System Versions

Published: 02 April 2018

Abstract

Source code changes can affect the performance of software. Structured knowledge about classes of such changes could guide software developers in avoiding performance-degrading changes and in applying performance-improving ones. For this purpose, neither a comprehensive overview nor a mature method for the structured detection of such changes exists. We address this research challenge by presenting Performance Analysis of Software Systems (PeASS). PeASS builds a comprehensive knowledge base of performance-affecting changes by analyzing the version history of a repository using its unit tests. It is based on a method for determining significant performance changes between two versions of a unit test by measurement and statistical analysis. Furthermore, PeASS uses regression test selection to reduce measurement time and a root cause isolation method to analyze performance changes. We demonstrate our methodology in the context of Java by analyzing the versions of Apache Commons IO.
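
To make the comparison step concrete, here is a minimal sketch of how two sets of unit-test execution times (the same test, executed in two versions) could be checked for a significant difference. It assumes Welch's t-test as the statistical test; the class and method names are illustrative, and it is not the actual PeASS implementation.

    // Illustrative sketch only: decide whether two samples of execution times
    // (same unit test, two versions) differ significantly via Welch's t-test.
    public class PerformanceChangeCheck {

        static double mean(double[] xs) {
            double sum = 0;
            for (double x : xs) sum += x;
            return sum / xs.length;
        }

        // Sample variance (requires at least two measurements).
        static double variance(double[] xs, double mean) {
            double sum = 0;
            for (double x : xs) sum += (x - mean) * (x - mean);
            return sum / (xs.length - 1);
        }

        /** True if the mean execution times differ significantly (approx. 5% level). */
        static boolean isSignificantChange(double[] oldTimes, double[] newTimes) {
            double m1 = mean(oldTimes), m2 = mean(newTimes);
            double v1 = variance(oldTimes, m1), v2 = variance(newTimes, m2);
            // Welch's t statistic for two samples with possibly unequal variances.
            double t = (m1 - m2) / Math.sqrt(v1 / oldTimes.length + v2 / newTimes.length);
            // 1.96 approximates the two-sided 5% critical value for large samples.
            return Math.abs(t) > 1.96;
        }

        public static void main(String[] args) {
            double[] before = {10.1, 10.3, 9.9, 10.2, 10.0, 10.1, 10.2, 9.8, 10.0, 10.1};
            double[] after  = {11.0, 11.2, 10.9, 11.1, 11.3, 11.0, 10.8, 11.1, 11.2, 11.0};
            System.out.println("Significant change: " + isSignificantChange(before, after));
        }
    }

A real measurement setup would additionally have to handle JVM warm-up, repetition across VM starts, and the choice of significance level, which this sketch deliberately omits.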




Published In

ICPE '18: Companion of the 2018 ACM/SPEC International Conference on Performance Engineering
April 2018
212 pages
ISBN:9781450356299
DOI:10.1145/3185768


Publisher

Association for Computing Machinery, New York, NY, United States



Author Tags

  1. mining software repositories
  2. performance testing

Qualifiers

  • Research-article

Conference

ICPE '18

Acceptance Rates

Overall Acceptance Rate 252 of 851 submissions, 30%



Cited By

  • (2025) Performance regression testing initiatives. Information and Software Technology, 179:C. DOI: 10.1016/j.infsof.2024.107641. Online publication date: 1-Mar-2025
  • (2023) Performance evolution of configurable software systems: an empirical study. Empirical Software Engineering, 28:6. DOI: 10.1007/s10664-023-10338-3. Online publication date: 13-Nov-2023
  • (2022) FADATest. Proceedings of the 44th International Conference on Software Engineering, pages 896-908. DOI: 10.1145/3510003.3510169. Online publication date: 21-May-2022
  • (2022) Automated Identification of Performance Changes at Code Level. 2022 IEEE 22nd International Conference on Software Quality, Reliability and Security (QRS), pages 916-925. DOI: 10.1109/QRS57517.2022.00096. Online publication date: Dec-2022
  • (2020) Performance mutation testing. Software Testing, Verification and Reliability, 31:5. DOI: 10.1002/stvr.1728. Online publication date: 29-Jan-2020
  • (2019) Analyzing performance-aware code changes in software development process. Proceedings of the 27th International Conference on Program Comprehension, pages 300-310. DOI: 10.1109/ICPC.2019.00049. Online publication date: 25-May-2019
  • (2019) PeASS. Proceedings of the 34th IEEE/ACM International Conference on Automated Software Engineering, pages 1146-1149. DOI: 10.1109/ASE.2019.00123. Online publication date: 10-Nov-2019
  • (2019) Accurate modeling of performance histories for evolving software systems. Proceedings of the 34th IEEE/ACM International Conference on Automated Software Engineering, pages 640-652. DOI: 10.1109/ASE.2019.00065. Online publication date: 10-Nov-2019
  • (2019) DYNAMOJM: A JMeter Tool for Performance Testing Using Dynamic Workload Adaptation. Testing Software and Systems, pages 234-241. DOI: 10.1007/978-3-030-31280-0_14. Online publication date: 8-Oct-2019
  • (2019) Towards an Efficient Performance Testing Through Dynamic Workload Adaptation. Testing Software and Systems, pages 215-233. DOI: 10.1007/978-3-030-31280-0_13. Online publication date: 8-Oct-2019
