DOI: 10.1145/1356058.1356060

Perfdiff: a framework for performance difference analysis in a virtual machine environment

Published: 06 April 2008

Abstract

Although applications running on virtual machines, such as Java programs, achieve platform independence, performance evaluation and analysis become difficult due to the extra intermediate layers and the dynamic nature of the virtual execution environment.
We present a framework for analyzing performance across multiple runs of a program, possibly in dramatically different execution environments. Our framework builds on our prior lightweight instrumentation technique for constructing a calling context tree (CCT) of methods at runtime. We first represent each run of a program by a CCT, annotating its edges and nodes with performance attributes such as call counts or elapsed times. We then identify components of the CCTs that are topologically identical but differ significantly in their performance attributes. Next, we identify the topological differences between two CCTs, ignoring the performance attributes. Finally, we identify the differences in both topology and performance attributes, which can be fed back to software developers or performance analysts for further scrutiny.
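
To make the representation concrete, the following is a minimal Java sketch of a call-count-annotated CCT node; it is illustrative only, not the paper's implementation, and the names (CctNode, call) are hypothetical:

    import java.util.LinkedHashMap;
    import java.util.Map;

    // Hypothetical sketch of a calling context tree (CCT) node annotated
    // with a call count; illustrates the representation, not the paper's code.
    final class CctNode {
        final String method;                     // method for this calling context
        final Map<String, CctNode> children = new LinkedHashMap<>();
        long callCount;                          // performance attribute on this node

        CctNode(String method) {
            this.method = method;
        }

        // Record one call from this context to 'callee', creating the child
        // node the first time this call edge is observed.
        CctNode call(String callee) {
            CctNode child = children.computeIfAbsent(callee, CctNode::new);
            child.callCount++;
            return child;
        }
    }

An instrumented run would keep a cursor into the tree, descending via call() on each method entry and returning to the parent on exit, so that each node accumulates the call count of one complete calling context.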
We have applied our methodology to a number of well-known Java benchmarks and a large J2EE application, using call counts as the performance attribute. Our results indicate that this approach can efficiently and effectively narrow differences down to a small percentage of the nodes in the CCT. We present an iterative framework for program analysis in which topological changes are applied to identify differences between CCTs. For most of the test programs, only a few topological changes (deletion, addition, and renaming of nodes) are needed to make any two CCTs from the same program identical, and once two CCTs are topologically matched, changing fewer than 2% of the performance attributes achieves a 90% overlap between them in those attributes. We have also applied our framework to identify subtle configuration differences in complex server applications.
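
As a rough sketch of the attribute-comparison step, reusing the hypothetical CctNode above and a simple relative-difference threshold (the paper's exact criterion may differ), two topologically matched CCTs can be walked in parallel to flag contexts whose call counts diverge:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Map;

    // Hypothetical sketch: walk two topologically matched CCTs in parallel and
    // report calling contexts whose call counts differ by more than 'threshold'
    // (e.g. 0.5 flags counts differing by more than 50% of the larger count).
    final class CctDiff {
        static List<String> significantDiffs(CctNode a, CctNode b, double threshold) {
            List<String> diffs = new ArrayList<>();
            walk(a, b, a.method, threshold, diffs);
            return diffs;
        }

        private static void walk(CctNode a, CctNode b, String path,
                                 double threshold, List<String> diffs) {
            long max = Math.max(a.callCount, b.callCount);
            if (max > 0 && Math.abs(a.callCount - b.callCount) > threshold * max) {
                diffs.add(path + ": " + a.callCount + " vs " + b.callCount);
            }
            // Recurse only into children present in both trees; children missing
            // on one side belong to the topological diff, handled separately.
            for (Map.Entry<String, CctNode> e : a.children.entrySet()) {
                CctNode match = b.children.get(e.getKey());
                if (match != null) {
                    walk(e.getValue(), match, path + " -> " + e.getKey(), threshold, diffs);
                }
            }
        }
    }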




    Published In

    CGO '08: Proceedings of the 6th annual IEEE/ACM international symposium on Code generation and optimization
    April 2008
    235 pages
    ISBN:9781595939784
    DOI:10.1145/1356058


    Publisher

    Association for Computing Machinery, New York, NY, United States


    Author Tags

    1. performance comparison
    2. virtual machine

    Qualifiers

    • Research-article

    Conference

    CGO '08

    Acceptance Rates

    CGO '08 Paper Acceptance Rate: 21 of 66 submissions, 32%
    Overall Acceptance Rate: 312 of 1,061 submissions, 29%



    Cited By

    • (2019) Enhancing Commit Graphs with Visual Runtime Clues. 2019 Working Conference on Software Visualization (VISSOFT), pp. 28-32. DOI: 10.1109/VISSOFT.2019.00012. Online publication date: Sep-2019.
    • (2019) Performance Evolution Matrix: Visualizing Performance Variations Along Software Versions. 2019 Working Conference on Software Visualization (VISSOFT), pp. 1-11. DOI: 10.1109/VISSOFT.2019.00009. Online publication date: Sep-2019.
    • (2018) Synthesizing programs that expose performance bottlenecks. Proceedings of the 2018 International Symposium on Code Generation and Optimization, pp. 314-326. DOI: 10.1145/3168830. Online publication date: 24-Feb-2018.
    • (2017) Diagnosing Performance Variations by Comparing Multi-Level Execution Traces. IEEE Transactions on Parallel and Distributed Systems 28(2), pp. 462-474. DOI: 10.1109/TPDS.2016.2567390. Online publication date: 1-Feb-2017.
    • (2015) JITProf: pinpointing JIT-unfriendly JavaScript code. Proceedings of the 2015 10th Joint Meeting on Foundations of Software Engineering, pp. 357-368. DOI: 10.1145/2786805.2786831. Online publication date: 30-Aug-2015.
    • (2014) Performance regression testing of concurrent classes. Proceedings of the 2014 International Symposium on Software Testing and Analysis, pp. 13-25. DOI: 10.1145/2610384.2610393. Online publication date: 21-Jul-2014.
    • (2013) Tracking performance failures with rizel. Proceedings of the 2013 International Workshop on Principles of Software Evolution, pp. 38-42. DOI: 10.1145/2501543.2501549. Online publication date: 19-Aug-2013.
    • (2013) Performance evolution blueprint: Understanding the impact of software evolution on performance. 2013 First IEEE Working Conference on Software Visualization (VISSOFT), pp. 1-9. DOI: 10.1109/VISSOFT.2013.6650523. Online publication date: Sep-2013.
    • (2012) Tracking down software changes responsible for performance loss. Proceedings of the International Workshop on Smalltalk Technologies, pp. 1-7. DOI: 10.1145/2448963.2448966. Online publication date: 28-Aug-2012.
    • (2012) Debugging performance failures. Proceedings of the 6th Workshop on Dynamic Languages and Applications, pp. 1-3. DOI: 10.1145/2307196.2307198. Online publication date: 11-Jun-2012.
