ABSTRACT
Performance is crucial to the success of an application. To build responsive and cost-efficient applications, software engineers must be able to detect and fix performance problems early in the development process. Existing approaches either operate at a level of abstraction too high to detect critical problems or require substantial manual effort. In this paper, we present a novel approach that integrates performance regression root cause analysis into the existing development infrastructure using performance-aware unit tests and the revision history. Our approach is easy to use and provides software engineers with immediate insights through automated root cause analysis. In a realistic case study based on the change history of Apache Commons Math, we demonstrate that our approach automatically detects and identifies the root cause of a major performance regression.
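To illustrate what a performance-aware unit test can look like, the sketch below wraps an ordinary JUnit test around a timing assertion for a Commons Math matrix multiplication. The class name, matrix size, warm-up count, and fixed time budget are illustrative assumptions only; the approach described in the paper compares measurements against earlier revisions rather than asserting against a hard-coded threshold.

```java
import static org.junit.Assert.assertTrue;

import org.apache.commons.math3.linear.Array2DRowRealMatrix;
import org.apache.commons.math3.linear.RealMatrix;
import org.junit.Test;

public class MatrixMultiplyPerfTest {

    // Hypothetical time budget in milliseconds; a revision-history-based
    // approach would derive this from measurements of earlier commits.
    private static final long BUDGET_MS = 200;

    @Test
    public void multiplyStaysWithinBudget() {
        RealMatrix a = new Array2DRowRealMatrix(new double[200][200]);
        RealMatrix b = new Array2DRowRealMatrix(new double[200][200]);

        // Warm up the JIT so the timed run is less noisy.
        for (int i = 0; i < 5; i++) {
            a.multiply(b);
        }

        long start = System.nanoTime();
        a.multiply(b);
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        assertTrue("multiply took " + elapsedMs + " ms, budget was "
                + BUDGET_MS + " ms", elapsedMs <= BUDGET_MS);
    }
}
```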