ABSTRACT
Previous studies of the assessment of programming via written examination have focused on analysis of the examination papers and the questions they contain. This paper reports the results of a study that investigated how these final exam papers are developed, how students are prepared for these exams, and what pedagogical foundations underlie the exams. The study involved interviews with 11 programming lecturers. From our analysis of the interviews, we find that most exams are based on existing formulas that are believed to work; that the lecturers tend to trust in the validity of their exams for summative assessment; and that while there is variation in the approaches taken to writing the exams, all of the exam writers take a fairly standard approach to preparing their students to sit the exam. We found little evidence of explicit references to learning theories or models, indicating that the process is based largely on intuition and experience.
Index Terms
- Assessment of programming: pedagogical foundations of exams