Common Threats and Mitigation Strategies in Requirements Engineering Experiments with Student Participants

  • Conference paper

In: Requirements Engineering: Foundation for Software Quality (REFSQ 2016)

Part of the book series: Lecture Notes in Computer Science (LNPSE, volume 9619)

Abstract

[Context and motivation] Experiments are an important means to evaluate research results in the field of requirements engineering. Researchers often conduct such experiments with student participants. [Question/problem] The use of student participants introduces a multitude of potential threats to validity, which must be properly addressed by the chosen experiment design. In practice, attention is mostly given to threats to the generalizability of the findings. However, current experiment reports often lack a proper discussion of further threats, for example, those caused by the recruitment of student participants. [Principal ideas/results] To provide mitigation strategies for student-specific threats to validity, these threats must first be known. We analyzed student experiments from published experiment reports to identify student-specific threats and to analyze adequate mitigation strategies. [Contribution] This paper contributes a detailed analysis of the threats to validity to be considered in student experiments, along with possible mitigation strategies to avoid these threats. In addition, we report on an experiment conducted in a university requirements engineering course in which we considered student-specific threats and applied the proposed mitigation strategies.



Acknowledgements

This research was partly funded by the German Federal Ministry of Education and Research under grant no. 01IS12005C and grant no. 01IS15058C. We thank our industrial partners for their support in creating the experiment material used. In particular, we thank Peter Heidl, Jens Höfflinger and John MacGregor (Bosch), Frank Houdek (Daimler), and Stefan Beck and Arnaud Boyer (Airbus).

Author information

Corresponding author

Correspondence to Marian Daun.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Daun, M., Salmon, A., Bandyszak, T., Weyer, T. (2016). Common Threats and Mitigation Strategies in Requirements Engineering Experiments with Student Participants. In: Daneva, M., Pastor, O. (eds) Requirements Engineering: Foundation for Software Quality. REFSQ 2016. Lecture Notes in Computer Science, vol 9619. Springer, Cham. https://doi.org/10.1007/978-3-319-30282-9_19

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-30282-9_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-30281-2

  • Online ISBN: 978-3-319-30282-9

  • eBook Packages: Computer Science (R0)
