ABSTRACT
Context: Systematic literature reviews have become common in software engineering in the last decade, but challenges remain.
Goal: Given these challenges, the objective is to describe areas where the writing of primary studies can be improved, and hence to provide a better basis for researchers aiming to synthesize research evidence in a specific area.
Method: The results presented are based on a literature review concerning the synthesis of research results in software engineering, with a particular focus on empirical software engineering. The literature review is complemented and exemplified with experiences from conducting systematic literature reviews and from working with research methodologies in empirical software engineering.
Results: The paper presents three areas where improvements are needed to synthesize empirical evidence more successfully: terminology, paper content and reviewing.
Conclusion: It is concluded that primary studies can be improved, but this requires that researchers write their research papers with synthesis in mind.