DOI: 10.1145/3545945.3569724

Executable Exams: Taxonomy, Implementation and Prospects

Published: 03 March 2023

ABSTRACT

Traditionally, exams in introductory programming courses have tended to be multiple choice or "paper-based" coding exams in which students write code by hand. This does not reflect how students typically write, and are assessed on, programming assignments, where they write code on a computer and can validate and assess their work using an auto-grading system.

Executable exams are exams in which students are given programming problems, write code on a computer within a development environment, and have their submissions digitally validated or executed. This format is far more consistent with how students engage with programming assignments.
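As a minimal, hypothetical sketch of what "digitally validated or executed" can mean in practice (not a description of any particular grading system discussed in the paper), a grader might import a student's submission and run it against instructor-defined test cases. The module name student_submission, the function name add, and the test cases below are illustrative assumptions.

    import importlib

    # Instructor-defined (arguments, expected result) pairs -- illustrative only.
    TEST_CASES = [
        ((2, 3), 5),
        ((-1, 1), 0),
        ((0, 0), 0),
    ]

    def grade(module_name="student_submission", func_name="add"):
        """Import the student's code and execute it against each test case."""
        student = importlib.import_module(module_name)
        func = getattr(student, func_name)
        passed = 0
        for args, expected in TEST_CASES:
            try:
                if func(*args) == expected:
                    passed += 1
            except Exception:
                pass  # a crashing submission simply fails that test case
        return passed, len(TEST_CASES)

    if __name__ == "__main__":
        score, total = grade()
        print(f"Passed {score}/{total} test cases")

A production grader would typically add sandboxing, time limits, and richer reporting, but the core idea is the same: the submission is executed rather than read.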

This paper explores the executable exam format and attempts to gauge the state of the practice and how prevalent the format is. First, we formulate a taxonomy of characteristics of executable exams, identifying common aspects and various levels of flexibility. We then give two case studies: one in which executable exams have been utilized for nearly 10 years and another in which they have been recently adopted. Finally, we present results from faculty surveys that provide evidence that, though not standard practice, the use of executable exams is not uncommon and appears to be on the rise.


Published in

SIGCSE 2023: Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1
March 2023
1481 pages
ISBN: 9781450394314
DOI: 10.1145/3545945

      Copyright © 2023 ACM


      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      • Published: 3 March 2023


      Qualifiers

      • research-article

      Acceptance Rates

Overall Acceptance Rate: 1,595 of 4,542 submissions, 35%

