Abstract
Testing is still the most important approach to reducing the number of software defects. Software quality metrics help prioritize where additional testing is needed by measuring the quality of the code. Most approaches for estimating whether a unit of code is sufficiently tested are based on code coverage, which measures which code fragments are exercised by the test suite. Unfortunately, code coverage does not measure to what extent the test suite actually checks the intended functionality.
We propose state coverage, a metric that measures the ratio of state updates read by assertions to the total number of state updates, and we present efficient algorithms for measuring it. Like code coverage, state coverage is simple to understand, and we show that it is efficient to measure and easy to aggregate. In a preliminary evaluation on several open-source libraries, state coverage helped identify multiple unchecked properties and detect several bugs.
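The core idea can be illustrated with a small sketch. This is not the paper's implementation (which targets .NET); it is a hypothetical, simplified model in which each state update is logged as an (object, field) write, and an assertion marks the updates whose values it reads. State coverage is then the fraction of logged updates that some assertion checked.

```python
# Illustrative sketch of the state-coverage idea (hypothetical model,
# not the paper's .NET implementation): log state updates as
# (object id, field) writes and mark those later read by an assertion.

class StateCoverageTracker:
    def __init__(self):
        self.updates = set()   # all (obj_id, field) writes observed
        self.checked = set()   # writes whose value an assertion read

    def record_update(self, obj, field):
        self.updates.add((id(obj), field))

    def record_assertion_read(self, obj, field):
        if (id(obj), field) in self.updates:
            self.checked.add((id(obj), field))

    def coverage(self):
        # With no updates there is nothing left unchecked.
        if not self.updates:
            return 1.0
        return len(self.checked) / len(self.updates)


# Toy unit under test: a counter that updates two fields per call.
class Counter:
    def __init__(self, tracker):
        self.t = tracker
        self.value = 0
        self.calls = 0

    def increment(self):
        self.value += 1
        self.t.record_update(self, "value")
        self.calls += 1
        self.t.record_update(self, "calls")


tracker = StateCoverageTracker()
c = Counter(tracker)
c.increment()

# The test asserts on `value` but never checks `calls`:
tracker.record_assertion_read(c, "value")
assert c.value == 1

print(tracker.coverage())  # 0.5: only half of the state updates are checked
```

A plain code-coverage tool would report `increment` as fully covered by this test, while the 0.5 state coverage exposes that the `calls` update is never validated by any assertion.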
© 2012 Springer-Verlag Berlin Heidelberg
Vanoverberghe, D., de Halleux, J., Tillmann, N., Piessens, F. (2012). State Coverage: Software Validation Metrics beyond Code Coverage. In: Bieliková, M., Friedrich, G., Gottlob, G., Katzenbeisser, S., Turán, G. (eds) SOFSEM 2012: Theory and Practice of Computer Science. SOFSEM 2012. Lecture Notes in Computer Science, vol 7147. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-27660-6_44
Print ISBN: 978-3-642-27659-0
Online ISBN: 978-3-642-27660-6