DOI: 10.1145/1295014.1295036

State coverage: a structural test adequacy criterion for behavior checking

Published: 3 September 2007

ABSTRACT

We propose a new language-independent, structural test adequacy criterion called state coverage. State coverage measures whether unit-level tests check the outputs and side effects of a program.
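
As an informal illustration of what "checking the outputs and side effects" means at the unit level, consider the two JUnit-style tests below. This is a minimal sketch only: the Account class, its methods, and the use of JUnit are our own illustrative assumptions, not the paper's subject program or tooling. The first test merely executes the method under test; the second checks both its return value and its effect on object state.

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    // Hypothetical class under test, used only for illustration.
    class Account {
        private int balance = 0;

        int deposit(int amount) {   // returns the new balance
            balance += amount;
            return balance;
        }

        int getBalance() {
            return balance;
        }
    }

    public class AccountTest {
        @Test
        public void executesButChecksNothing() {
            // Exercises deposit() but never inspects its return value or the
            // resulting account state; this is the kind of test that state
            // coverage is meant to flag as inadequate.
            new Account().deposit(100);
        }

        @Test
        public void checksOutputAndSideEffect() {
            Account account = new Account();
            // Checks the method's output (its return value) ...
            assertEquals(100, account.deposit(100));
            // ... and its side effect on program state.
            assertEquals(100, account.getBalance());
        }
    }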

State coverage differs in several respects from existing test adequacy criteria such as code coverage and mutation adequacy. Unlike other coverage-based criteria, state coverage measures the extent to which program behavior is checked. And unlike existing fault-based criteria such as mutation adequacy, state coverage is designed to be readily automated and to present users with easily understood test-inadequacy reports.
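
To make the contrast with code coverage concrete, the following sketch (again using hypothetical names, not an example from the paper) shows a test that executes every statement and branch of the code under test yet never checks its behavior. A statement- or branch-coverage tool reports such a test as fully adequate, whereas a criterion that measures behavior checks would flag the unchecked results.

    import org.junit.Test;

    // Hypothetical code under test, used only for illustration.
    class Discount {
        // Applies a flat discount once the price reaches a threshold.
        static int apply(int price) {
            if (price >= 100) {
                return price - 10;
            }
            return price;
        }
    }

    public class DiscountTest {
        @Test
        public void fullCoverageWithoutBehaviorChecks() {
            // Both branches of apply() execute, so statement and branch
            // coverage report 100%, yet no assertion examines the returned
            // values, so this test cannot detect an incorrect discount.
            Discount.apply(150);
            Discount.apply(50);
        }
    }

A mutation-adequacy tool would also expose this weakness, since mutants of apply() survive a test that asserts nothing, but only after generating and executing the mutants.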

An experiment showed strong positive correlations between the number of behavior checks and both state coverage and mutation adequacy.


Published in
ESEC-FSE Companion '07: The 6th Joint Meeting of the European Software Engineering Conference and the ACM SIGSOFT Symposium on the Foundations of Software Engineering: Companion Papers
September 2007, 189 pages
ISBN: 9781595938121
DOI: 10.1145/1295014

        Copyright © 2007 ACM


Publisher: Association for Computing Machinery, New York, NY, United States
