Abstract
[Context and motivation] Experiments are an important means to evaluate research results in the field of requirements engineering. Researchers often conduct such experiments with student participants. [Question/problem] The use of student participants evokes a multitude of potential threats to validity, which must be properly addressed by the chosen experiment design. In practice, attention is mostly given to threats to the generalizability of the findings. However, current experiment reports often lack a proper discussion of further threats, such as those caused by the recruitment of student participants. [Principal ideas/results] To provide mitigation strategies for student-specific threats to validity, these threats must be known. We analyzed student experiments from published experiment reports to identify student-specific threats and to analyze adequate mitigation strategies. [Contribution] This paper contributes a detailed analysis of the threats to validity to be considered in student experiments, and possible mitigation strategies to avoid these threats. In addition, we report on an experiment conducted in a university requirements engineering course, where we considered student-specific threats and applied the proposed mitigation strategies.
Acknowledgements
This research was partly funded by the German Federal Ministry of Education and Research under grant no. 01IS12005C and grant no. 01IS15058C. We thank our industrial partners for their support in creating the experiment material used. In particular, we thank Peter Heidl, Jens Höfflinger and John MacGregor (Bosch), Frank Houdek (Daimler), and Stefan Beck and Arnaud Boyer (Airbus).
Copyright information
© 2016 Springer International Publishing Switzerland
Cite this paper
Daun, M., Salmon, A., Bandyszak, T., Weyer, T. (2016). Common Threats and Mitigation Strategies in Requirements Engineering Experiments with Student Participants. In: Daneva, M., Pastor, O. (eds) Requirements Engineering: Foundation for Software Quality. REFSQ 2016. Lecture Notes in Computer Science(), vol 9619. Springer, Cham. https://doi.org/10.1007/978-3-319-30282-9_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-30281-2
Online ISBN: 978-3-319-30282-9