
The impact of students' skills and experiences on empirical results: a controlled experiment with undergraduate and graduate students

Published: 27 April 2015

ABSTRACT

In empirical software engineering research, graduate students are often seen as legitimate substitutes for industry professionals. It has also been argued in the literature that empirical results from experiments with undergraduate students as participants generalize to a much lower extent. In this paper, we report on a controlled experiment conducted separately with graduate and undergraduate students in order to gain deeper insight into whether experiments with graduates and with undergraduates in the software engineering field lead to equal or significantly different conclusions. During the experiment, the students applied a specific validation technique for behavioral requirements of embedded software. We observed that graduates were significantly more effective, efficient, and confident in their tasks than the undergraduates. Nevertheless, the experiment with undergraduates also shows significant results, albeit with a smaller effect size.

Published in

EASE '15: Proceedings of the 19th International Conference on Evaluation and Assessment in Software Engineering
April 2015, 305 pages
ISBN: 9781450333504
DOI: 10.1145/2745802

Copyright © 2015 ACM

          Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 27 April 2015


          Qualifiers

research-article

          Acceptance Rates

EASE '15 paper acceptance rate: 20 of 65 submissions, 31%. Overall acceptance rate: 71 of 232 submissions, 31%.
