DOI: 10.1145/1066677.1067016
Article

Towards the prioritization of regression test suites with data flow information

Published: 13 March 2005

ABSTRACT

Regression test prioritization techniques re-order the execution of a test suite in an attempt to ensure that defects are revealed earlier in the test execution phase. In prior work, test suites were prioritized with respect to their ability to satisfy control flow-based and mutation-based test adequacy criteria. In this paper, we propose an approach to regression test prioritization that leverages the all-DUs test adequacy criterion, which focuses on the definitions and uses of variables within the program under test. Our prioritization scheme is motivated by empirical studies showing that (i) tests fulfilling the all-DUs criterion are more likely to reveal defects than those that meet control flow-based criteria, (ii) the relationship between all-DUs and mutation-based criteria is unclear, and (iii) mutation-based testing is significantly more expensive than testing that relies upon all-DUs.

In support of our prioritization technique, we provide a formal statement of the algorithms and equations that we use to instrument the program under test, perform test suite coverage monitoring, and calculate test adequacy. Furthermore, we examine the architecture of a tool that implements our novel prioritization scheme and facilitates experimentation. A preliminary experimental evaluation with this tool indicates that, for three case study applications, our prioritization can be performed with acceptable time and space overheads. Finally, these experiments also demonstrate that the prioritized test suite can have an improved potential to identify defects earlier during the process of test execution.
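The kind of prioritization the abstract describes can be illustrated with a small sketch. The Python below is not the paper's algorithm or tool; it is a hypothetical greedy "additional coverage" ordering over invented def-use coverage data, where each test is assumed to map to the set of (variable, definition line, use line) pairs it exercises. Real coverage data would come from the instrumentation and monitoring the paper formalizes.

```python
# Hypothetical sketch: greedy prioritization of a regression test suite
# by all-DUs coverage. At each step, pick the test that covers the most
# def-use pairs not yet covered by the tests already scheduled.
# All test names and coverage sets here are invented for illustration.

def prioritize_by_du_coverage(du_coverage):
    """Return test names ordered so new def-use pairs are covered earliest.

    du_coverage: dict mapping test name -> set of def-use pairs,
    where a pair is a (variable, definition_line, use_line) tuple.
    Ties are broken deterministically by test name.
    """
    remaining = dict(du_coverage)
    covered = set()
    order = []
    while remaining:
        # The test contributing the most not-yet-covered def-use pairs.
        best = max(remaining, key=lambda t: (len(remaining[t] - covered), t))
        order.append(best)
        covered |= remaining.pop(best)
    return order

if __name__ == "__main__":
    suite = {
        "t1": {("x", 1, 4), ("y", 2, 5)},
        "t2": {("x", 1, 4)},
        "t3": {("y", 2, 5), ("z", 3, 6), ("x", 1, 8)},
    }
    # t3 runs first (three new pairs), then t2 and t1 each add one pair.
    print(prioritize_by_du_coverage(suite))
```

A test that covers many unique def-use pairs is scheduled early, so a defect lying on any of those definition-use paths has a chance to be revealed sooner; this mirrors the "additional" greedy strategy common in the prioritization literature, not necessarily the exact scheme of this paper.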


Published in

SAC '05: Proceedings of the 2005 ACM Symposium on Applied Computing
March 2005, 1814 pages
ISBN: 1581139640
DOI: 10.1145/1066677

Copyright © 2005 ACM


Publisher: Association for Computing Machinery, New York, NY, United States

Acceptance Rates

Overall Acceptance Rate: 1,650 of 6,669 submissions, 25%
