DOI: 10.1145/3338906.3340459

Code coverage at Google

Published: 12 August 2019

Abstract

Code coverage is a measure of the degree to which a test suite exercises a software system. Although coverage is well established in software engineering research, its deployment in industry is often inhibited by doubts about its usefulness and by the computational cost of analyzing coverage at scale. At Google, coverage information is computed daily for one billion lines of code, across seven programming languages. A key aspect of making coverage information actionable is to apply it at the level of changesets and code review. This paper describes Google's code coverage infrastructure, how the computed coverage information is visualized and used, and the challenges and solutions involved in adopting code coverage at scale. To study how code coverage is adopted and perceived by developers, the paper analyzes adoption rates, error rates, and average coverage ratios over a five-year period, and reports on 512 responses received from a survey of 3000 developers. Finally, the paper provides concrete suggestions for how to implement and use code coverage in an industrial setting.
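The abstract's central idea — reporting coverage at the level of a changeset rather than the whole repository — can be made concrete with a small sketch. The following is a hypothetical illustration, not Google's actual pipeline: `changeset_coverage` and its inputs are invented names, and real tooling (e.g., gcov, JaCoCo, coverage.py) would supply the per-file covered-line sets. The idea is simply to intersect the lines touched by a change with the lines executed by tests.

```python
def changeset_coverage(changed_lines, covered_lines):
    """Fraction of changed lines that some test executed.

    changed_lines: dict mapping file path -> set of line numbers touched
                   by the changeset (e.g., parsed from a unified diff).
    covered_lines: dict mapping file path -> set of line numbers executed
                   by the test suite (from a coverage report).
    """
    total = 0  # changed lines overall
    hit = 0    # changed lines that were executed by at least one test
    for path, lines in changed_lines.items():
        covered = covered_lines.get(path, set())
        total += len(lines)
        hit += len(lines & covered)
    # An empty changeset is vacuously fully covered.
    return hit / total if total else 1.0


if __name__ == "__main__":
    changed = {"foo.py": {10, 11, 12, 20}, "bar.py": {3}}
    covered = {"foo.py": {9, 10, 11, 30}}
    # 2 of the 5 changed lines (foo.py:10 and foo.py:11) were executed.
    print(f"{changeset_coverage(changed, covered):.0%}")
```

A number like this, surfaced directly in code review, tells a reviewer how much of the change under review is actually tested — which is far more actionable than a repository-wide percentage.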


Cited By

  • (2024) WhisperFuzz. Proceedings of the 33rd USENIX Conference on Security Symposium, pp. 5377–5394. DOI: 10.5555/3698900.3699201. Published 14 Aug 2024.
  • (2024) R2E. Proceedings of the 41st International Conference on Machine Learning, pp. 21196–21224. DOI: 10.5555/3692070.3692922. Published 21 Jul 2024.
  • (2024) On the Analysis of Coverage Feedback in a Fuzzing Proprietary System. Applied Sciences 14(13), 5939. DOI: 10.3390/app14135939. Published 8 Jul 2024.


Published In

ESEC/FSE 2019: Proceedings of the 2019 27th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering
August 2019
1264 pages
ISBN:9781450355728
DOI:10.1145/3338906
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives International 4.0 License.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. coverage
  2. industrial study
  3. test infrastructure

Qualifiers

  • Research-article

Conference

ESEC/FSE '19: August 26–30, 2019, Tallinn, Estonia

Acceptance Rates

Overall Acceptance Rate 112 of 543 submissions, 21%


