
Software testability measurement for intelligent assertion placement

  • Published in: Software Quality Journal

Abstract

Voas defines software testability as the degree to which software reveals faults during testing. This software characteristic is important when determining how to best apply verification techniques and build quality assurance plans. When testability is low, testers often want advice on how to increase it. In this paper, we describe the use of testability measures (using Voas's definition) for intelligent assertion placement. Software assertions are one relatively simple trick for improving testability. Voas's perspective on what software testability is has been implemented via three algorithms that together comprise a technique termed 'sensitivity analysis'. Sensitivity analysis analyses how likely a test scheme is to (1) propagate data state errors to the output space, (2) cause internal states to become corrupted when faults are exercised, and (3) exercise the code. By knowing where faults appear likely to hide from a particular test scheme, we have insight into where internal tests (assertions) are particularly beneficial. This paper explores using the one sensitivity analysis algorithm that measures propagation as a heuristic for where and how to inject software assertions.
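The propagation measure the abstract describes can be illustrated with a minimal sketch. This is not Voas's actual PIE implementation: the toy program, the site numbering, and the perturbation scheme are invented for illustration. The idea is to corrupt the data state at one internal location, rerun the program, and count how often the corruption reaches the output. Sites with a low propagation estimate are the ones where faults can hide, and hence where an internal assertion pays off.

```python
import random

def run(x, perturb_site=None, rng=None):
    """Toy program; optionally inject a data-state error at one internal site."""
    a = x * x + 1                       # site 0
    if perturb_site == 0:
        a += rng.randint(1, 50)         # corrupt the state after site 0
    b = x + 3                           # site 1
    if perturb_site == 1:
        b += rng.randint(1, 50)         # corrupt the state after site 1
    flag = 1 if a > 0 else 0            # lossy use of a: masks most corruption
    return flag + b                     # b feeds the output directly

def propagation_estimate(site, tests, trials=200, seed=0):
    """Fraction of perturbed runs whose output differs from the clean run."""
    rng = random.Random(seed)
    differ = sum(
        run(x, site, rng) != run(x)
        for x in tests
        for _ in range(trials)
    )
    return differ / (len(tests) * trials)

tests = list(range(-5, 6))
for site in (0, 1):
    print(f"site {site}: propagation estimate "
          f"{propagation_estimate(site, tests):.2f}")
# site 0: propagation estimate 0.00
# site 1: propagation estimate 1.00
```

Under this test scheme, corruption at site 0 never propagates (the value is only ever compared against zero downstream), so the heuristic would place an assertion there, e.g. `assert a == x * x + 1`, to reveal corrupted states that output-based testing alone would miss. Site 1 propagates everything and needs no assertion.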



Cite this article

Voas, J. Software testability measurement for intelligent assertion placement. Software Quality Journal 6, 327–336 (1997). https://doi.org/10.1023/A:1018532607070
