DOI: 10.1145/2652524.2652559
Research article

Writing for synthesis of evidence in empirical software engineering

Published: 18 September 2014

ABSTRACT

Context: Systematic literature reviews have become common in software engineering in the last decade, but challenges remain.

Goal: Given the challenges, the objective is to describe improvement areas in writing primary studies, and hence provide a good basis for researchers aiming at synthesizing research evidence in a specific area.

Method: The results presented are based on a literature review with respect to synthesis of research results in software engineering with a particular focus on empirical software engineering. The literature review is complemented and exemplified with experiences from conducting systematic literature reviews and working with research methodologies in empirical software engineering.

Results: The paper presents three areas where improvements are needed to become more successful in synthesizing empirical evidence. These three areas are: terminology, paper content and reviewing.

Conclusion: It is concluded that the primary studies can be improved, but this requires that researchers have synthesis in mind when writing their research papers.


Published in:

ESEM '14: Proceedings of the 8th ACM/IEEE International Symposium on Empirical Software Engineering and Measurement
September 2014, 461 pages
ISBN: 9781450327749
DOI: 10.1145/2652524

      Copyright © 2014 ACM


Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance rates: ESEM '14 paper acceptance rate: 23 of 123 submissions (19%). Overall acceptance rate: 130 of 594 submissions (22%).
