MND-SCEMP: an empirical study of a software cost estimation modeling process in the defense domain

Abstract

The primary focus of weapon systems research and development has shifted from hardware to software, and the cost of software development is increasing steadily. Accurate estimation of software development cost is therefore a very important task in the defense domain. However, existing models and tools for software cost estimation are not suitable for the defense domain because of accuracy problems, so it is necessary to develop cost estimation models that are appropriate to specific domains. Furthermore, most studies of methodology development follow generic methodologies that do not consider the factors pertinent to specific domains, whereas new methodologies should reflect those domains. In this study, we apply two generic methodologies to the development of a software cost estimation model and then propose an integrated modeling process tailored to the national defense domain. To validate the proposed modeling process, we performed an empirical study of 113 software development projects on weapon systems in Korea. A software cost estimation model was developed by applying the proposed modeling process; its MMRE was 0.566, an accuracy appropriate for practical use. We conclude that the modeling process and the software cost estimation model developed in this study are suitable for estimating resource requirements during weapon system development in South Korea’s national defense domain. They may facilitate more accurate resource estimation by project planners, which will lead to more successful project execution.
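
For reference, MMRE denotes the mean magnitude of relative error, a standard accuracy measure for effort estimation models. Writing $y_i$ for the actual effort and $\hat{y}_i$ for the estimated effort of project $i$ over $n$ projects (notation introduced here for illustration), it is computed as

$$ MMRE = \frac{1}{n}\sum_{i=1}^{n}\frac{\left| y_i - \hat{y}_i \right|}{y_i} $$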

References

  • Baik J, Boehm B, Steece BM (2002) Disaggregating and calibrating the case tool variable in COCOMO II. IEEE Trans Softw Eng 28. doi:10.1109/TSE.2002.1049401

  • Bailey JW, Basili VR (1981) A meta-model for software development resource expenditures. In: International conference on software engineering

  • Barbour R (2006) CMMI: the DoD perspective. SEI presentation

  • Benediktsson O, Dalcher D, Reed K, Woodman M (2003) COCOMO-based effort estimation for iterative and incremental software development. Softw Qual J 11:265–281. doi:10.1023/A:1025809010217

  • Berntsson-Svensson R, Aurum A (2006) Successful software project and products: an empirical investigation. In: International symposium on empirical software engineering. doi:10.1145/1159733.1159757

  • Boehm B, Abts C, Brown AW, Chulani S, Clark BK, Horowitz E, Madachy R, Reifer D, Steece B (2000a) Software cost estimation with COCOMO II. Prentice Hall

  • Boehm B, Abts C, Chulani S (2000b) Software development cost estimation approaches: a survey. Ann Softw Eng 10:177–205. doi:10.1023/A:1018991717352

  • Boehm B, Brown W, Madachy R, Yang Y (2004) Software product line life cycle cost estimation model. In: ACM-IEEE international symposium on empirical software engineering. doi:10.1109/ISESE.2004.1334903

  • Briand LC, Wieczorek I (2002) Resource estimation in software engineering. Wiley, New York

  • Cebrowski AK, Garstka JH (1998) Network-centric warfare: its origin and future. In: U.S. Naval Institute proceedings

  • CHA (2011) http://www.standishgroup.com/newsroom/chaos_2009.php

  • Chulani S, Boehm B, Steece B (1999) Bayesian analysis of empirical software engineering cost models. IEEE Trans Softw Eng. doi:10.1109/32.799958

  • COC (2011) http://csse.usc.edu/csse/research/cocomoii/cocomodownloads.htm

  • Cod (2011) http://sunset.usc.edu/research/codecount/

  • Galorath D (2001) SEER-SEM. Galorath Inc.

  • GAO (1992) Software problem led to system failure at Dhahran, Saudi Arabia. U.S. Government Accountability Office

  • GAO (2005) Tactical aircraft, F/A-22 and JSF acquisition plans and implications for tactical aircraft modernization, statement of Michael Sullivan, director, acquisition and sourcing management issues. U.S. Government Accountability Office (GAO-05-519T)

  • Group CAW (2008) Levels of information systems interoperability

  • Heemstra FJ (1992) Software cost estimation. Inf Softw Technol 34:627–639

  • ISPA (1999) Parametric estimating handbook, 2nd edn. Parametric estimating initiative

  • Jones C (2007) Estimating software costs, 2nd edn. McGraw-Hill

  • Kocaguneli E, Menzies T, Keung JW (2011) Kernel methods for software effort estimation. Empir Software Eng 1–24. doi:10.1007/s10664-011-9189-1

  • Lee D, Baik J, Shin J (2009) Software reliability assurance using a framework in weapon system development: a case study. In: IEEE/ACIS international conference on computer and information science. doi:10.1109/ICIS.2009.168

  • Leung H, Fan Z (2002) Software cost estimation. In: Handbook of software engineering

  • Morisio M, Stamelos I, Spahos V, Romano D (1999) Measuring functionality and productivity in web-based applications: a case study. In: International software metrics symposium

  • Myrtveit I, Stensrud E, Olsson UH (2001) Analyzing data sets with missing data: an empirical evaluation of imputation methods and likelihood-based methods. IEEE Trans Softw Eng 27:999–1013. doi:10.1109/32.965340

  • NCW (2005) The implementation of network-centric warfare. U.S. Department of Defense

  • Prieto-Diaz R (1990) Domain analysis: an introduction. ACM SIGSOFT Softw Eng Notes 15:47–54. doi:10.1145/382296.382703

  • Saleem B, Dhavachelvan P (2010) Analysis of empirical software effort estimation models. Int J Comput Sci Inf Secur 7:68–77

  • Seo YS, Yoon KA, Bae DH (2008) An empirical analysis of software effort estimation with outlier elimination. In: International workshop on predictor models in software engineering. doi:10.1145/1370788.1370796

  • VDC (2006) The embedded software market intelligence program

Acknowledgement

This work was supported by the National Research Foundation of Korea Grant funded by the Korean Government (MEST) (NRF-2010-0014375).

Author information

Corresponding author

Correspondence to Taeho Lee.

Additional information

Editor: Tim Menzies

Appendix: Cost factors affecting embedded software testing effort (see Section 4.3)

A.1 Security

Security refers to the ability to prevent data loss and abuse caused by external intrusion. In an NCW environment, data is not confined to a single weapon system; it is transferred from one weapon system to another to maximize the capability of the weapon systems. If the data transferred among weapon systems is exposed, considerable property damage and loss of human life may result. To improve security, software should be designed with security in mind, and security should be monitored and controlled continuously through quality assurance activities. The activities required to improve security can increase the software development effort. We define the security of software using a three-level Likert scale, as shown in Table 10.

Table 10 Security level guideline

A.2 Interoperability

Interoperability is the ability to transfer data between systems or components and to use the transferred data. In an NCW environment, weapon systems do not operate independently because they are closely connected; a weapon system can be viewed as a component of a system of systems. Thus, interoperability must be considered during software development. To improve interoperability, the data formats and protocols should be defined at the outset, and software developers should adhere to them. The more interoperability is required, the more software development effort is needed, because the specifics of the connected systems must be taken into account. We adopted the LISI (Levels of Information Systems Interoperability) model (Group 2008) as the scale for the interoperability levels, as shown in Table 11.

Table 11 LISI model

A.3 Hardware and software development simultaneity

Hardware and software development simultaneity indicates how long software development is delayed by protracted hardware development. Software is integrated with the hardware during the system integration phase, after hardware development is completed, so if the hardware is completed earlier than the software, system testing can proceed without delay. A delay in hardware development makes resource allocation ineffective and increases the software development cost. Hardware and software development simultaneity is measured as the ratio of the software development delay caused by delayed hardware development to the total duration of software development.

$$ Simultaneity = \frac{\text{Delayed duration of software development (months)}}{\text{Total duration of software development (months)}} $$
(3)
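
As a hypothetical illustration of Eq. (3): if a software development effort scheduled to take 12 months is held up for 3 months because the hardware is delivered late, the simultaneity value is

$$ Simultaneity = \frac{3~\text{months}}{12~\text{months}} = 0.25 $$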

A.4 Hardware emulator quality

A hardware emulator is a mock-up that emulates the functions and interfaces of the target hardware. The emulator is used to facilitate system testing by overcoming the time constraints caused by hardware development delays. A good-quality hardware emulator can lower the cost and shorten the duration of software development because it provides more time for testing and allows earlier detection of software defects, and defects identified earlier are cheaper to correct. A poor-quality hardware emulator, however, can confuse software testing because it cannot be guaranteed that observed system defects are caused by the software, and additional time and effort are needed to fix defects in the emulator itself. Hardware emulator quality is evaluated by considering the scope and criticality of the emulated functions and the number of defects inherent in the emulator. We defined hardware emulator quality using a five-level Likert scale, as shown in Table 12.

Table 12 Emulator quality guideline

A.5 Hardware precedentedness

Hardware precedentedness indicates the degree to which the target hardware is similar to hardware that has been developed previously. When the hardware is unprecedented, its behavior and interfaces are less well understood, so more effort is required to develop and test the software that runs on it. We defined hardware precedentedness using a Likert scale, as shown in Table 13.

Table 13 Hardware precedentedness

Cite this article

Lee, T., Gu, T. & Baik, J. MND-SCEMP: an empirical study of a software cost estimation modeling process in the defense domain. Empir Software Eng 19, 213–240 (2014). https://doi.org/10.1007/s10664-012-9220-1
