DOI: 10.1145/2462476.2465586

Assessment of programming: pedagogical foundations of exams

Published: 1 July 2013

ABSTRACT

Previous studies of assessment of programming via written examination have focused on analysis of the examination papers and the questions they contain. This paper reports the results of a study that investigated how these final exam papers are developed, how students are prepared for these exams, and what pedagogical foundations underlie the exams. The study involved interviews with 11 programming lecturers. From our analysis of the interviews, we find that most exams are based on existing formulas that are believed to work; that the lecturers tend to trust the validity of their exams for summative assessment; and that while there is variation in the approaches taken to writing the exams, all of the exam writers take a fairly standard approach to preparing their students to sit the exam. We found little evidence of explicit references to learning theories or models, indicating that the process is based largely on intuition and experience.


Published in

ITiCSE '13: Proceedings of the 18th ACM Conference on Innovation and Technology in Computer Science Education
July 2013, 384 pages
ISBN: 978-1-4503-2078-8
DOI: 10.1145/2462476

Copyright © 2013 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

ITiCSE '13 paper acceptance rate: 51 of 161 submissions, 32%
Overall acceptance rate: 552 of 1,613 submissions, 34%
