Abstract
Software inspection is an effective method of defect detection. Recent research has considered the development of tool support to further increase the efficiency and effectiveness of inspection, resulting in a number of prototype tools. However, no comprehensive evaluations of these tools have been carried out to determine their effectiveness in comparison with traditional paper-based inspection. This issue must be addressed if tool-supported inspection is to become an accepted alternative to, or even a replacement for, paper-based inspection. This paper describes a controlled experiment comparing the effectiveness of tool-supported software inspection with that of paper-based inspection, using a new prototype software inspection tool known as ASSIST (Asynchronous/Synchronous Software Inspection Support Tool). Forty-three students used both ASSIST and paper-based inspection to inspect two C++ programs of approximately 150 lines each. The subjects performed both an individual inspection and a group collection meeting, representing a typical inspection process. Subjects performed as well with tool-based inspection as with paper-based inspection, measured in terms of the number of defects found, the number of false positives reported, and meeting gains and losses.
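The dependent variables named in the abstract can be made concrete with a minimal sketch in C++ (the language of the inspected programs). The defect identifiers, set contents, and helper names below are hypothetical illustrations, not data or code from the study: defects found are reported items matching known defects, false positives are reported items matching none, a meeting gain is a known defect first identified at the meeting, and a meeting loss is a known defect found individually but absent from the meeting report.

```cpp
// Illustrative computation of inspection metrics: defects found,
// false positives, and meeting gains/losses. All sets are hypothetical.
#include <algorithm>
#include <iostream>
#include <iterator>
#include <set>

using DefectSet = std::set<int>;  // each int names one candidate defect

// Items present in both a and b.
static DefectSet intersect(const DefectSet& a, const DefectSet& b) {
    DefectSet out;
    std::set_intersection(a.begin(), a.end(), b.begin(), b.end(),
                          std::inserter(out, out.begin()));
    return out;
}

// Items in a that are absent from b.
static DefectSet setMinus(const DefectSet& a, const DefectSet& b) {
    DefectSet out;
    std::set_difference(a.begin(), a.end(), b.begin(), b.end(),
                        std::inserter(out, out.begin()));
    return out;
}

int main() {
    const DefectSet known      = {1, 2, 3, 4, 5};  // defects known to be present
    const DefectSet individual = {1, 2, 7};        // union of individual-phase reports
    const DefectSet meeting    = {1, 3, 7};        // contents of the meeting report

    // Defects found: reported items that match known defects.
    const DefectSet found = intersect(meeting, known);
    // False positives: reported items that match no known defect.
    const DefectSet falsePositives = setMinus(meeting, known);
    // Meeting gain: known defects first identified during the meeting.
    const DefectSet gain = intersect(setMinus(meeting, individual), known);
    // Meeting loss: known defects found individually but dropped at the meeting.
    const DefectSet loss = intersect(setMinus(individual, meeting), known);

    std::cout << "found: " << found.size()
              << "  false positives: " << falsePositives.size()
              << "  gain: " << gain.size()
              << "  loss: " << loss.size() << '\n';  // found: 2, fp: 1, gain: 1, loss: 1
}
```

In the experiment these counts, tallied per subject and per group under the tool and paper conditions, are the quantities being compared.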