DOI: 10.1145/3030207.3030242

Many Flies in One Swat: Automated Categorization of Performance Problem Diagnosis Results

Published: 17 April 2017

Abstract

As application performance grows in importance in modern enterprise systems, many organizations employ application performance management (APM) tools to help them deal with potential performance problems in production. In addition to monitoring capabilities, these tools provide problem detection and alerting. In large enterprise systems, they can report very large numbers of performance problems, each of which must currently be handled individually in a time-consuming and error-prone manual process, even though many of them share a common root cause. In this vision paper, we propose automatic categorization for dealing with the large numbers of performance problems reported by APM tools. Categorization aggregates related reports, reducing the work required to resolve them. It also opens the possibility of extending existing analysis approaches to use this information for more efficient diagnosis of performance problems.
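The proposed categorization amounts to grouping problem reports that likely share a root cause. The paper does not fix a concrete algorithm here, so the following is only an illustrative sketch: the report features (`entry_point`, `hotspot_method`, `problem_type`) and the greedy distance-threshold grouping are invented for illustration and stand in for a real feature extraction and cluster analysis.

```python
from dataclasses import dataclass

@dataclass
class ProblemReport:
    """A hypothetical APM problem report, reduced to a few illustrative features."""
    entry_point: str      # e.g. the HTTP endpoint where the slowdown was observed
    hotspot_method: str   # method diagnosed as consuming the most time
    problem_type: str     # e.g. "db-query", "lock-contention", "gc-pressure"

def similarity(a: ProblemReport, b: ProblemReport) -> float:
    """Fraction of matching features -- a toy similarity measure."""
    matches = sum([a.entry_point == b.entry_point,
                   a.hotspot_method == b.hotspot_method,
                   a.problem_type == b.problem_type])
    return matches / 3

def categorize(reports, threshold=0.66):
    """Greedy single-pass grouping: a report joins the first category
    whose representative (first member) it resembles closely enough;
    otherwise it founds a new category."""
    categories = []
    for r in reports:
        for cat in categories:
            if similarity(r, cat[0]) >= threshold:
                cat.append(r)
                break
        else:
            categories.append([r])
    return categories

reports = [
    ProblemReport("/checkout", "OrderDao.find", "db-query"),
    ProblemReport("/checkout", "OrderDao.find", "db-query"),
    ProblemReport("/search", "Index.lock", "lock-contention"),
    ProblemReport("/checkout", "OrderDao.find", "lock-contention"),
]
groups = categorize(reports)
print(len(groups))  # number of distinct root-cause categories
```

In practice the feature extraction and the clustering method would be far richer (e.g., the algorithms surveyed in the clustering literature the paper builds on); the point is only that reports collapsing into one category need to be diagnosed once, not once per report.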



Published In

ICPE '17: Proceedings of the 8th ACM/SPEC on International Conference on Performance Engineering
April 2017
450 pages
ISBN:9781450344043
DOI:10.1145/3030207

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. application performance management
  2. categorization
  3. cluster analysis

Qualifiers

  • Research-article

Funding Sources

  • German Federal Ministry of Education and Research
  • Serbian Ministry of Education

Conference

ICPE '17

Acceptance Rates

  • ICPE '17: 27 of 83 submissions accepted (33%)
  • Overall: 252 of 851 submissions accepted (30%)

Article Metrics

  • Total citations: 0
  • Total downloads: 89
  • Downloads (last 12 months): 5
  • Downloads (last 6 weeks): 0

Reflects downloads up to 23 Feb 2025.
