DOI: 10.1145/2479871.2479879

Automated root cause isolation of performance regressions during software development

Published: 21 April 2013

ABSTRACT

Performance is crucial for the success of an application. To build responsive and cost-efficient applications, software engineers must be able to detect and fix performance problems early in the development process. Existing approaches either rely on a level of abstraction too high to detect critical problems or require substantial manual effort. In this paper, we present a novel approach that integrates performance regression root cause analysis into the existing development infrastructure using performance-aware unit tests and the revision history. Our approach is easy to use and gives software engineers immediate insight through automated root cause analysis. In a realistic case study based on the change history of Apache Commons Math, we demonstrate that our approach can automatically detect a major performance regression and identify its root cause.
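To make the idea of a performance-aware unit test concrete, the following sketch shows one plausible realization in plain JUnit 4: a functional test is complemented by a timing assertion, so that running the suite across revisions (for example, with git bisect) can flag the first commit that breaks the time budget. The workload, warm-up and repetition counts, and the 50 ms budget are illustrative assumptions, not values taken from the paper.

import static org.junit.Assert.assertTrue;

import org.junit.Test;

/**
 * Minimal sketch of a "performance-aware" JUnit test: a functional workload
 * is timed, and the test fails if the measured time exceeds a budget.
 * Workload, iteration counts, and the budget are illustrative assumptions.
 */
public class MatrixMultiplyPerfTest {

    // Hypothetical workload standing in for the code under test.
    private static double[][] multiply(double[][] a, double[][] b) {
        int n = a.length;
        double[][] c = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int k = 0; k < n; k++)
                for (int j = 0; j < n; j++)
                    c[i][j] += a[i][k] * b[k][j];
        return c;
    }

    @Test
    public void multiplyStaysWithinTimeBudget() {
        int n = 200;
        double[][] a = new double[n][n];
        double[][] b = new double[n][n];
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++) {
                a[i][j] = i + j;
                b[i][j] = i - j;
            }

        // Warm-up iterations so the JIT compiles the hot loop before measuring.
        for (int i = 0; i < 5; i++)
            multiply(a, b);

        // Measure several runs and keep the fastest one to reduce noise.
        long bestNanos = Long.MAX_VALUE;
        for (int i = 0; i < 10; i++) {
            long start = System.nanoTime();
            multiply(a, b);
            bestNanos = Math.min(bestNanos, System.nanoTime() - start);
        }

        long budgetMillis = 50; // assumed budget; tune per environment
        long measuredMillis = bestNanos / 1_000_000;
        assertTrue("performance regression: " + measuredMillis + " ms > " + budgetMillis + " ms",
                   measuredMillis <= budgetMillis);
    }
}

Taking the best of several measured runs after a warm-up phase is one simple way to dampen JIT and scheduling noise; the approach described in the paper combines such measurements with the revision history to isolate the root cause of a regression automatically.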


Published in

ICPE '13: Proceedings of the 4th ACM/SPEC International Conference on Performance Engineering
April 2013
446 pages
ISBN: 9781450316361
DOI: 10.1145/2479871

            Copyright © 2013 ACM

            Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

            Publisher

            Association for Computing Machinery

            New York, NY, United States



            Acceptance Rates

ICPE '13 paper acceptance rate: 28 of 64 submissions, 44%. Overall acceptance rate: 252 of 851 submissions, 30%.
