DOI: 10.1145/3319008.3319708
short-paper

Do software engineering practitioners cite software testing research in their online articles?: A larger scale replication

Published: 15 April 2019

ABSTRACT

Background: Software engineering (SE) research continues to study the degree to which practitioners perceive research as relevant to practice. Such studies typically comprise surveys of practitioner opinions. In a preliminary, relatively small-scale study of online articles, we previously found few explicit citations to software testing research. That study provided an in situ complement to the typical survey study; however, its findings were limited by the size of our sample.

Objective: To further investigate whether and how practitioners cite software testing research in the grey literature, using a larger and more diverse dataset.

Method: We analyse four distinct datasets totalling over 400,000 online articles with approximately 2 million external citations. Two datasets were generated by crawling predefined domains, and two were generated by applying heuristics, developed in prior research, to Google searches. Citations are classified and then analysed.
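The classification step can be illustrated with a minimal sketch. This is a hypothetical heuristic for illustration only, not the classification scheme the authors used: treat an external link as a citation to research if its host matches a known research-publishing domain. The domain list and function name are assumptions.

```python
# Hypothetical sketch: classify an external link as a "research citation"
# by matching its host against a small list of research-publishing domains.
# The domain list and function name are illustrative assumptions, not the
# classification scheme used in the paper.
from urllib.parse import urlparse

RESEARCH_HOSTS = {
    "doi.org", "dl.acm.org", "ieeexplore.ieee.org",
    "arxiv.org", "link.springer.com", "sciencedirect.com",
}

def is_research_citation(url: str) -> bool:
    host = urlparse(url).netloc.lower()
    # Strip a leading "www." so "www.arxiv.org" matches "arxiv.org".
    if host.startswith("www."):
        host = host[4:]
    return host in RESEARCH_HOSTS

links = [
    "https://doi.org/10.1145/3319008.3319708",
    "https://stackoverflow.com/questions/12345",
]
research = [u for u in links if is_research_citation(u)]
print(len(research))  # 1: only the doi.org link points to a research host
```

In practice a heuristic like this undercounts (research also appears on university and personal sites) and any real study would need a more careful scheme; the sketch only shows the shape of link-based classification.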

Results: We find a (very) low percentage of citations to research.

Conclusion: Our replication corroborates our preliminary study and the findings of others. In relative terms, topic-specific searches appear to return articles with more citations. Our results and method provide a basis for benchmarking.


Published in

EASE '19: Proceedings of the 23rd International Conference on Evaluation and Assessment in Software Engineering
April 2019, 345 pages
ISBN: 9781450371452
DOI: 10.1145/3319008
Copyright © 2019 ACM


Publisher

Association for Computing Machinery, New York, NY, United States


          Qualifiers

          • short-paper
          • Research
          • Refereed limited

Acceptance Rates

EASE '19 paper acceptance rate: 20 of 73 submissions (27%)
Overall acceptance rate: 71 of 232 submissions (31%)
