Abstract:
In order to examine the reliability of peer assessment settings for evaluating programming skills using peer assessment and the item response theory (IRT) technique in small classes, optimal conditions such as the number of peer raters and the number of tasks are investigated using parameters estimated from the survey data. The survey data consisted of 31 students, each of whom participated in three of the five peer assessment sessions. Peer rating conditions, such as the number of peer raters or tasks, are examined using mean expected standard errors. In addition, the relationship between the instructor's ratings and the abilities estimated by the IRT model is examined across variations in the number of peer raters and tasks. The results provide evidence that a set of guidelines could better organise peer assessment for the evaluation of programming skills in actual course settings.
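
As a rough illustration of the mean expected standard error criterion mentioned in the abstract, the following minimal Python sketch (not the paper's model; the two-parameter logistic item parameters and the grid of rater and task counts below are hypothetical) treats each peer-rater-by-task pairing as an IRT item and shows how the expected standard error of an ability estimate shrinks as the number of peer raters and tasks grows.

# Minimal sketch under assumed 2PL IRT parameters; values are illustrative only.
import numpy as np

def item_information(theta, a, b):
    # Fisher information of a 2PL item at ability theta.
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))
    return a**2 * p * (1.0 - p)

def expected_se(theta, a, b):
    # Expected standard error of the ability estimate = 1 / sqrt(test information).
    return 1.0 / np.sqrt(item_information(theta, a, b).sum())

rng = np.random.default_rng(0)
theta = 0.0                          # examinee ability (hypothetical)
for n_raters in (2, 4, 8):           # vary the number of peer raters
    for n_tasks in (1, 3, 5):        # vary the number of tasks
        n_items = n_raters * n_tasks                 # each rater-task pair acts like one item
        a = rng.uniform(0.8, 1.5, n_items)           # discrimination parameters (hypothetical)
        b = rng.normal(0.0, 1.0, n_items)            # difficulty parameters (hypothetical)
        print(f"raters={n_raters} tasks={n_tasks} expected SE={expected_se(theta, a, b):.3f}")

In this simplified setting, the expected standard error falls roughly with the square root of the number of rater-task pairings, which is the kind of trade-off the abstract's investigation of peer-rating conditions addresses.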
Published in: 2024 21st International Conference on Information Technology Based Higher Education and Training (ITHET)
Date of Conference: 06-08 November 2024
Date Added to IEEE Xplore: 16 January 2025