DOI: 10.1145/2414536.2414557

Application of domain specific heuristics to an innovative computer based assessment strategy

Published: 26 November 2012

ABSTRACT

Students undertaking summative and formative assessment are acutely aware of the importance of the outcome and often find themselves in a stressful situation that demands a high level of concentration. Usability testing of assessment tool interfaces is hindered by the difficulty of sufficiently replicating the exam environment: evaluators must either disrupt, watch or monitor students while they are undertaking an exam, or rely on the students' memories to reproduce their concerns at a later stage. This research demonstrates how a set of heuristics is adapted and redefined to enable an iterative approach to the improvement of a computer-aided online assessment tool. The revised set of heuristics offers future developers a tool to assist in the development of online assessment interfaces.
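To make the heuristic-evaluation procedure more concrete, the minimal sketch below shows one way a domain-specific heuristic set of this kind could be encoded and used to record violations during an iterative review. The heuristic statements, severity scale, and data structures here are illustrative assumptions (loosely modelled on Nielsen-style severity ratings), not the adapted set defined in the paper.

from dataclasses import dataclass

# Illustrative 0-4 severity scale in the spirit of Nielsen-style severity ratings.
SEVERITY = {0: "not a problem", 1: "cosmetic", 2: "minor", 3: "major", 4: "catastrophic"}

@dataclass
class Heuristic:
    code: str        # short identifier, e.g. "A1"
    statement: str   # the domain-specific rule the evaluator checks

@dataclass
class Violation:
    heuristic: Heuristic
    location: str    # screen or interaction where the problem was observed
    severity: int    # key into SEVERITY
    note: str = ""

# Hypothetical heuristics for computer-based assessment interfaces;
# the actual adapted set is the one defined and refined in the paper.
HEURISTICS = [
    Heuristic("A1", "Remaining exam time is always visible without obscuring the question."),
    Heuristic("A2", "Navigating between questions never discards an unsaved answer."),
    Heuristic("A3", "It is always clear which questions are answered, flagged or skipped."),
]

def report(violations):
    # List violations worst-first so the next design iteration addresses them in order.
    for v in sorted(violations, key=lambda v: v.severity, reverse=True):
        print(f"[{SEVERITY[v.severity]}] {v.heuristic.code} at {v.location}: {v.note}")

if __name__ == "__main__":
    found = [
        Violation(HEURISTICS[1], "question 12 -> 13", 3, "typed answer lost on back-navigation"),
        Violation(HEURISTICS[0], "essay editor view", 2, "timer hidden when the editor is maximised"),
    ]
    report(found)

In an iterative study of the kind the abstract describes, each round of recorded violations would feed the next revision of both the interface and, where needed, the heuristic set itself.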


Published in

OzCHI '12: Proceedings of the 24th Australian Computer-Human Interaction Conference
November 2012, 692 pages

      Copyright © 2012 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].

Publisher

Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 362 of 729 submissions, 50%