Test-Data Generation Guided by Static Defect Detection

  • Regular Paper
Journal of Computer Science and Technology

Abstract

Software testing is an important technique for assuring the quality of software systems, especially high-confidence systems. Many automatic test-data generation techniques have been proposed to automate the testing process. In this paper, we propose a test-data generation technique guided by static defect detection. Using static defect detection analysis, our approach first identifies a set of suspicious statements that are likely to contain faults, and then generates test data to cover these suspicious statements by reducing the test-data generation problem to a constraint satisfaction problem. We performed a case study to validate the effectiveness of our approach and made a simple comparison with JUnit Factory, an online test-data generation tool. The results show that, compared with JUnit Factory, our approach generates fewer test data while remaining competitive in fault detection.
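The two-step idea in the abstract — flag a suspicious statement with static analysis, then solve the path condition leading to it as a constraint satisfaction problem — can be sketched as follows. This is a minimal illustration with hypothetical names, not the paper's actual implementation: the suspicious statement is a potential division by zero, and the "constraint solver" is an exhaustive search over a small integer domain.

```python
def function_under_test(x, y):
    if x > 0 and x + y == 10:
        q = 100 // (x - y)   # suspicious statement flagged by static analysis
        return q
    return 0

# Path condition required to reach the suspicious statement,
# expressed as a conjunction of predicates over the inputs.
path_condition = [
    lambda x, y: x > 0,
    lambda x, y: x + y == 10,
]

def generate_test_data(predicates, domain=range(-20, 21)):
    """Treat the path condition as a constraint satisfaction problem and
    enumerate solutions (here by brute force over a small domain)."""
    for x in domain:
        for y in domain:
            if all(p(x, y) for p in predicates):
                yield (x, y)

# A few generated inputs that cover the suspicious statement.
tests = list(generate_test_data(path_condition))[:3]
```

Note that the generated input (5, 5) — also a solution of the path condition — actually triggers the division by zero, which is how covering statically flagged statements translates into fault detection.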



Author information

Corresponding author

Correspondence to Lu Zhang.

Additional information

This work is sponsored by the National High-Tech Research and Development 863 Program of China under Grant No. 2007AA010301, the National Natural Science Foundation of China under Grant Nos. 60803012 and 90718016, and China Postdoctoral Science Foundation funded project under Grant No. 20080440254.


Cite this article

Hao, D., Zhang, L., Liu, MH. et al. Test-Data Generation Guided by Static Defect Detection. J. Comput. Sci. Technol. 24, 284–293 (2009). https://doi.org/10.1007/s11390-009-9224-5
