Article
DOI: 10.1145/1062455.1062515

Main effects screening: a distributed continuous quality assurance process for monitoring performance degradation in evolving software systems

Published: 15 May 2005

Abstract

Developers of highly configurable, performance-intensive software systems often use a form of in-house performance-oriented "regression testing" to ensure that their modifications have not adversely affected their software's performance across its large configuration space. Unfortunately, time and resource constraints often limit developers to in-house testing of a small number of configurations and unreliable extrapolation from these results to the entire configuration space, which allows many performance bottlenecks and sources of QoS degradation to escape detection until systems are fielded. To improve performance assessment of evolving systems across large configuration spaces, we have developed a distributed continuous quality assurance (DCQA) process called main effects screening that uses in-the-field resources to execute formally designed experiments that help reduce the configuration space, thereby allowing developers to perform more targeted in-house QA. We have evaluated this process via feasibility studies on several large, widely used, performance-intensive software systems. Our results indicate that main effects screening can detect key sources of performance degradation in large-scale systems with significantly less effort than conventional techniques.
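The statistical core of the process the abstract describes can be sketched in a few lines: run a designed set of configurations, measure performance in each, then estimate each option's main effect as the difference between the mean response at its high and low settings; options with large effects are kept for targeted in-house testing. The sketch below is purely illustrative and not the paper's actual infrastructure: the option names and the `perf()` model are invented stand-ins for a real benchmark, and it runs a full 2^k factorial for simplicity where a screening design would use far fewer runs (e.g., a fractional factorial).

```python
from itertools import product

# Hypothetical binary configuration options (invented for illustration).
OPTIONS = ["caching", "pooling", "nagle", "logging"]

def perf(cfg):
    # Stand-in benchmark: "latency in ms" as a function of option settings.
    # In the real process this would be a measured run on fielded machines.
    base = 100.0
    base -= 30.0 if cfg["caching"] else 0.0   # large main effect
    base += 1.0 if cfg["pooling"] else 0.0    # negligible effect
    base += 15.0 if cfg["nagle"] else 0.0     # large main effect
    base += 2.0 if cfg["logging"] else 0.0    # small effect
    return base

# Evaluate every configuration (full 2^k factorial, k = 4 here).
runs = []
for bits in product([0, 1], repeat=len(OPTIONS)):
    cfg = dict(zip(OPTIONS, bits))
    runs.append((cfg, perf(cfg)))

def main_effect(option):
    # Main effect = mean response at the high setting
    # minus mean response at the low setting.
    hi = [y for cfg, y in runs if cfg[option]]
    lo = [y for cfg, y in runs if not cfg[option]]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {opt: main_effect(opt) for opt in OPTIONS}

# Options with the largest |effect| are retained for in-house QA;
# the rest of the configuration space can be sampled more coarsely.
important = sorted(effects, key=lambda o: abs(effects[o]), reverse=True)
print(important)  # caching and nagle dominate in this toy model
```

Because the effects here are additive with no interactions, the ranking recovers exactly the magnitudes built into `perf()`; real systems require replication and interaction-aware designs to separate signal from noise.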




Published In

ICSE '05: Proceedings of the 27th International Conference on Software Engineering
May 2005, 754 pages
ISBN: 1581139632
DOI: 10.1145/1062455

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. design of experiment theory
  2. distributed continuous quality assurance
  3. performance-oriented regression testing



Acceptance Rates

Overall acceptance rate: 276 of 1,856 submissions (15%)



Cited By

  • (2024) "Maximizing Bioactive Compound Extraction from Mandarin (Citrus reticulata) Peels through Green Pretreatment Techniques", Oxygen 4(3):307-324, DOI: 10.3390/oxygen4030018, 11 Aug 2024
  • (2018) "Synthesizing programs that expose performance bottlenecks", Proceedings of the 2018 International Symposium on Code Generation and Optimization, pages 314-326, DOI: 10.1145/3168830, 24 Feb 2018
  • (2016) "Mining performance regression inducing code changes in evolving software", Proceedings of the 13th International Conference on Mining Software Repositories, pages 25-36, DOI: 10.1145/2901739.2901765, 14 May 2016
  • (2015) "An industrial case study on the automated detection of performance regressions in heterogeneous environments", Proceedings of the 37th International Conference on Software Engineering, Volume 2, pages 159-168, DOI: 10.5555/2819009.2819034, 16 May 2015
  • (2015) "Cost-efficient sampling for performance prediction of configurable systems", Proceedings of the 30th IEEE/ACM International Conference on Automated Software Engineering, pages 342-352, DOI: 10.1109/ASE.2015.45, 9 Nov 2015
  • (2014) "Performance regression testing of concurrent classes", Proceedings of the 2014 International Symposium on Software Testing and Analysis, pages 13-25, DOI: 10.1145/2610384.2610393, 21 Jul 2014
  • (2014) "Performance regression testing target prioritization via performance risk analysis", Proceedings of the 36th International Conference on Software Engineering, pages 60-71, DOI: 10.1145/2568225.2568232, 31 May 2014
  • (2013) "Human performance regression testing", Proceedings of the 2013 International Conference on Software Engineering, pages 152-161, DOI: 10.5555/2486788.2486809, 18 May 2013
  • (2013) "Testing Distributed Communication Protocols by Formal Performance Monitoring", Evaluation of Novel Approaches to Software Engineering, pages 110-125, DOI: 10.1007/978-3-642-54092-9_8, 2013
  • (2010) "Monitoring, analysis, and testing of deployed software", Proceedings of the FSE/SDP Workshop on Future of Software Engineering Research, pages 263-268, DOI: 10.1145/1882362.1882417, 7 Nov 2010
