Abstract
Benchmarking is one of the most important methods for learning best practices in software process improvement. In current software process practice, however, benchmarking targets projects rather than individual software development tasks. Can software development tasks be benchmarked, and if so, how? Moreover, benchmarking software development tasks must cope with multivariate inputs and outputs and with variable returns to scale (VRS). This paper reports practical experience of benchmarking software development tasks under multivariate and VRS constraints using Data Envelopment Analysis (DEA). Analysis of experience data from the Institute of Software, Chinese Academy of Sciences (ISCAS) indicates that the ideas and techniques used to benchmark software projects can also be deployed at the software development task level. The results further show that the DEA VRS model gives developers new insight into how to identify relatively efficient tasks as task performance benchmarks and how to establish a distinct reference set for each relatively inefficient task under multivariate and VRS constraints. We therefore recommend the DEA VRS model as the default technique for benchmarking software development tasks. These results are beneficial to software process improvement. To the best of our knowledge, this is the first report of such comprehensive and repeatable results of benchmarking software development tasks using DEA.
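To make the benchmarking mechanism concrete, the sketch below solves the standard input-oriented BCC (VRS) linear program for one task (DMU): a task is relatively efficient when its score θ* equals 1, and the positive λ weights of an inefficient task identify its reference set of efficient peers. This is a minimal illustration under stated assumptions only; the function name `bcc_efficiency`, the use of SciPy's `linprog`, and the toy input/output data are our own choices for exposition, not the paper's implementation or the ISCAS dataset.

```python
import numpy as np
from scipy.optimize import linprog

def bcc_efficiency(X, Y, o):
    """Input-oriented BCC (VRS) efficiency of task o.

    X: (m, n) input matrix, Y: (s, n) output matrix; columns are tasks (DMUs).
    Returns (theta, reference set), where theta = 1 marks a relatively
    efficient task and the reference set lists peers with positive weight.
    """
    m, n = X.shape
    s = Y.shape[0]
    # Decision variables z = [theta, lambda_1, ..., lambda_n]; minimize theta.
    c = np.zeros(n + 1)
    c[0] = 1.0
    # Input constraints: sum_j lambda_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[:, [o]], X])
    b_in = np.zeros(m)
    # Output constraints: -sum_j lambda_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((s, 1)), -Y])
    b_out = -Y[:, o]
    # VRS convexity constraint: sum_j lambda_j = 1
    A_eq = np.concatenate([[0.0], np.ones(n)]).reshape(1, -1)
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (n + 1),
                  method="highs")
    theta, lam = res.x[0], res.x[1:]
    refs = [j for j in range(n) if lam[j] > 1e-6 and j != o]
    return theta, refs

# Hypothetical data: one input (effort, person-hours) and two outputs
# (size in LOC, quality score) for four development tasks.
X = np.array([[20.0, 35.0, 50.0, 40.0]])
Y = np.array([[400.0, 900.0, 1000.0, 600.0],
              [0.9, 0.8, 0.95, 0.7]])
for o in range(X.shape[1]):
    theta, refs = bcc_efficiency(X, Y, o)
    print(f"task {o}: efficiency = {theta:.2f}, reference set = {refs}")
```

The convexity constraint (Σλ = 1) is what distinguishes the VRS model from the constant-returns-to-scale CRS model; dropping it would force each task to be compared against scaled versions of its peers, which the abstract argues is inappropriate for development tasks of differing sizes.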