A Comparison of Tool-Based and Paper-Based Software Inspection

Published in Empirical Software Engineering 3 (1998)

Abstract

Software inspection is an effective method of defect detection. Recent research has considered the development of tool support to further increase the efficiency and effectiveness of inspection, resulting in a number of prototype tools. However, no comprehensive evaluation of these tools has been carried out to determine their effectiveness relative to traditional paper-based inspection. This issue must be addressed if tool-supported inspection is to become an accepted alternative to, or even a replacement for, paper-based inspection. This paper describes a controlled experiment comparing the effectiveness of tool-supported software inspection with that of paper-based inspection, using a new prototype software inspection tool known as ASSIST (Asynchronous/Synchronous Software Inspection Support Tool). Forty-three students used both ASSIST and paper-based inspection to inspect two C++ programs of approximately 150 lines each. The subjects performed both an individual inspection and a group collection meeting, representing a typical inspection process. Subjects performed equally well with tool-based inspection as with paper-based inspection, as measured by the number of defects found, the number of false positives reported, and meeting gains and losses.
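To make the outcome measures concrete, the sketch below shows one common way such measures are computed in inspection experiments. It is a minimal Python illustration assuming defects are tracked by identifier; the function name, data, and exact formulation are ours, not taken from the paper or from ASSIST. A meeting gain is a defect first recorded at the collection meeting, and a meeting loss is a defect found during individual preparation but absent from the meeting report.

    # Illustrative sketch only: the function and data below are hypothetical
    # examples, not code from ASSIST or from the experiment itself.

    def meeting_metrics(individual_reports, meeting_report):
        """Compute meeting gains and losses from sets of defect IDs.

        individual_reports: one set of defect IDs per inspector, found
                            during individual preparation.
        meeting_report: the set of defect IDs recorded at the group
                        collection meeting.
        """
        found_individually = set().union(*individual_reports)
        gains = meeting_report - found_individually    # first found at the meeting
        losses = found_individually - meeting_report   # found before, lost at the meeting
        return gains, losses

    # Three inspectors and a meeting report over defects D1..D5.
    individuals = [{"D1", "D2"}, {"D2", "D3"}, {"D1", "D4"}]
    meeting = {"D1", "D2", "D3", "D5"}

    gains, losses = meeting_metrics(individuals, meeting)
    print("meeting gains:", gains)    # {'D5'}
    print("meeting losses:", losses)  # {'D4'}

Under these definitions, the experiment could compare gains and losses, alongside total defects found and false positives reported, across the tool-based and paper-based conditions.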




Cite this article

Macdonald, F., Miller, J. A Comparison of Tool-Based and Paper-Based Software Inspection. Empirical Software Engineering 3, 233–253 (1998). https://doi.org/10.1023/A:1009747104814
