
Defect propagation at the project-level: results and a post-hoc analysis on inspection efficiency

Empirical Software Engineering

Abstract

Inspections are increasingly used to enhance software quality. While the effectiveness of inspections in uncovering defects is widely accepted, little research takes a holistic approach that considers defect counts from the initial phases of the development process (requirements, design, and coding) and examines defect propagation with defect counts aggregated to the project level (i.e., the application level). Using inspection data collected from a large software development firm, this paper investigates the extent of defect propagation at the project level during the early lifecycle phases. I argue that defect propagation can be observed in the relationship between defects found in one phase and defects found in the subsequent phase. Both Ordinary Least Squares and Three-Stage Least Squares analyses support the hypotheses on defect propagation. Moreover, the results show that inspection efficiency (defects found per unit of inspection time) decreases as the software product progresses from requirements to design to coding. A post-hoc analysis revealed further insights into inspection efficiency. In each phase, as inspection time increased, efficiency reached an optimal point and then dropped off. In addition, a project’s inspection efficiency generally tends to remain stable from one phase to the next. These insights give managers a means to assess inspections and their efficiency, and to adjust the time allotted to inspecting a project’s artifacts in both the current and the subsequent phase. Implications for managers and future research directions are discussed.
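
As a minimal illustration of the project-level efficiency measure described above (a sketch only: the column names and numbers are hypothetical and are not the SwDevCo data or the analysis code used in the study), the following Python snippet aggregates defect counts and inspection hours to the project level within each phase and computes efficiency as defects found per hour:

```python
# Hypothetical sketch: project-level inspection efficiency per phase,
# where efficiency = defects found / inspection hours.
import pandas as pd

# Assumed layout; the actual SwDevCo data set is not reproduced here.
inspections = pd.DataFrame({
    "project": ["A", "A", "A", "B", "B", "B"],
    "phase":   ["requirements", "design", "coding",
                "requirements", "design", "coding"],
    "defects": [14, 9, 6, 22, 11, 8],
    "hours":   [10.0, 12.5, 15.0, 16.0, 14.0, 18.0],
})

# Aggregate defects and hours to the project level within each phase,
# then compute efficiency.
by_phase = inspections.groupby(["project", "phase"], as_index=False)[["defects", "hours"]].sum()
by_phase["efficiency"] = by_phase["defects"] / by_phase["hours"]
print(by_phase)
```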


Notes

  1. The role of inspection is merely to find defects, not to correct them.

  2. The start of this period coincided with SwDevCo’s initiative to systematically collect inspection data across the requirements, design, and coding phases for its projects (applications).

  3. The aggregated “number of inspections” is the total number of inspections conducted for a particular project during a given phase (e.g., requirements analysis).

  4. The aggregated “number of reviewers” is the sum of reviewers across all inspections for a particular project during a given phase, not the number of distinct reviewers for that project.

  5. The aggregated “inspection time” includes both preparation and review time for a particular project during a given phase.

  6. Complexity refers to the complexity of the document being inspected, aggregated to the project level.

  7. In Winsorizing at 90 %, values below the 5th percentile are set to the 5th-percentile value, while values above the 95th percentile are set to the 95th-percentile value (illustrated in the sketch following these notes).

  8. The variance inflation factor (VIF) indicates the severity of multicollinearity in OLS regression; it measures how much the variance of an estimated regression coefficient is inflated because of multicollinearity (see the sketch following these notes).

  9. The intercept of the polynomial function is set to 0, since a hypothetical inspection time close to zero hours results in zero defects and hence zero efficiency (see the sketch following these notes).

  10. For example, the highest 20 %, the next-highest 20 %, the middle 20 %, and so on.

  11. After the initial analysis, the number of inspectors was dropped from the model to reduce multicollinearity.

  12. A major requirements defect could result in multiple design defects, while a minor requirements defect (e.g., a misspelled word in the requirements specification) might not result in a design defect at all.

  13. I thank one of the reviewers for the observation that inspecting a downstream artifact with a lower level of abstraction can simplify the identification of defects of all severities that have propagated from an upstream artifact with a higher level of abstraction in the software development process.
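
To make the techniques mentioned in notes 7–9 concrete, the sketch below illustrates 90 % Winsorization, the variance inflation factor, and a zero-intercept polynomial fit in Python. It uses made-up numbers and assumed model choices (e.g., a quadratic in inspection time); it is not the analysis code or the data used in the study.

```python
# Illustrative sketch for notes 7-9 (hypothetical data, not the study's code).
import numpy as np

# Note 7: Winsorizing at 90 % caps both tails at the 5th and 95th percentiles.
def winsorize_90(values):
    values = np.asarray(values, dtype=float)
    lower, upper = np.percentile(values, [5, 95])
    return np.clip(values, lower, upper)

print(winsorize_90([1, 2, 3, 4, 5, 6, 7, 8, 9, 1000]))  # the outlier 1000 is pulled in

# Note 8: the VIF of predictor j is 1 / (1 - R^2_j), where R^2_j comes from
# regressing predictor j on the remaining predictors.
def vif(X):
    X = np.asarray(X, dtype=float)
    vifs = []
    for j in range(X.shape[1]):
        y, others = X[:, j], np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(y)), others])   # intercept + other predictors
        beta, *_ = np.linalg.lstsq(A, y, rcond=None)
        r2 = 1.0 - ((y - A @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
        vifs.append(1.0 / (1.0 - r2))
    return vifs

# Note 9: a zero-intercept polynomial is fitted by omitting the constant column
# from the design matrix, so zero inspection time implies zero defects.
t = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])           # inspection hours (made up)
defects = np.array([5.0, 11.0, 15.0, 17.0, 18.0, 18.5])  # defects found (made up)
A = np.column_stack([t, t ** 2])                          # defects ~ b1*t + b2*t^2
(b1, b2), *_ = np.linalg.lstsq(A, defects, rcond=None)
efficiency = (b1 * t + b2 * t ** 2) / t                   # fitted defects per hour
print(b1, b2, efficiency)
```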


Acknowledgments

The author thanks the Empirical Software Engineering Editors-in-Chief and the review team for their guidance through the review process. This research was partly funded by grants from the Earl V. Snyder Innovation Management Center and the Robert H. Brethen Operations Management Institute at the Whitman School of Management, Syracuse University.

Author information

Corresponding author

Correspondence to Padmal Vitharana.

Additional information

Communicated by: Tony Gorschek


About this article


Cite this article

Vitharana, P. Defect propagation at the project-level: results and a post-hoc analysis on inspection efficiency. Empir Software Eng 22, 57–79 (2017). https://doi.org/10.1007/s10664-015-9415-3

