ABSTRACT
Students undertaking summative and formative assessment are acutely aware of how much the outcome matters and often find themselves in a stressful situation demanding a high level of concentration. Usability testing of assessment tool interfaces is hindered by the difficulty of adequately replicating the exam environment: evaluators cannot disrupt, watch or monitor students while they are sitting an exam, and relying on students' memories to reproduce their concerns afterwards is unreliable. This research demonstrates how a set of heuristics was adapted and refined to enable an iterative approach to improving a computer-aided online assessment tool. The revised set of heuristics offers future developers a tool to assist in the design of online assessment interfaces.