Abstract
Peer assessment, an assessment method grounded in a social constructivist approach, has become popular in recent years. When the number of learners is large, as in MOOCs, peer assessment is often conducted by dividing learners into multiple groups to reduce each learner's assessment workload. In this case, however, a difficulty remains: the assessment accuracy for learners in each group depends on the raters assigned to that group. To solve this problem, this study proposes a group optimization method that maximizes peer assessment accuracy based on item response theory, formulated as an integer programming problem. Experimental results, however, showed that the proposed method does not necessarily achieve higher accuracy than random group formation. We therefore further propose an external rater selection method that assigns a few outside-group raters to each learner. Simulation and actual data experiments demonstrate that introducing external raters selected by the proposed method improves peer assessment accuracy considerably.
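To make the group-formation idea concrete, the following is a minimal toy sketch, not the paper's actual formulation: it assumes each rater has a single scalar reliability weight (a hypothetical stand-in for the IRT rater parameters) and exhaustively searches all partitions into equal-size groups to maximize the worst group's total rater reliability. The paper instead solves this class of assignment problem with integer programming; exhaustive search is used here only because it is self-contained and feasible at toy scale.

```python
from itertools import combinations


def best_partition(alphas, group_size):
    """Search all partitions of learners into equal-size groups,
    maximizing the minimum total rater reliability per group.
    `alphas[i]` is a hypothetical reliability weight for learner i
    acting as a rater (a stand-in for IRT rater parameters)."""
    learners = list(range(len(alphas)))
    best_score, best_groups = float("-inf"), None

    def rec(remaining, groups):
        nonlocal best_score, best_groups
        if not remaining:
            # Score a complete partition by its weakest group.
            score = min(sum(alphas[i] for i in g) for g in groups)
            if score > best_score:
                best_score, best_groups = score, [list(g) for g in groups]
            return
        # Fix the first unassigned learner in the next group to avoid
        # enumerating the same partition in a different order.
        first, rest = remaining[0], remaining[1:]
        for others in combinations(rest, group_size - 1):
            new_remaining = [i for i in rest if i not in others]
            rec(new_remaining, groups + [(first,) + others])

    rec(learners, [])
    return best_score, best_groups


# Hypothetical reliabilities for 6 learners, split into groups of 3.
score, groups = best_partition([0.9, 0.2, 0.8, 0.3, 0.7, 0.4], 3)
```

On this toy instance the search balances strong and weak raters across groups rather than concentrating the reliable raters in one group, which is the intuition behind optimizing group formation instead of assigning learners randomly.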
Copyright information
© 2017 Springer International Publishing AG
Cite this paper
Uto, M., Thien, N.D., Ueno, M. (2017). Group Optimization to Maximize Peer Assessment Accuracy Using Item Response Theory. In: André, E., Baker, R., Hu, X., Rodrigo, M., du Boulay, B. (eds) Artificial Intelligence in Education. AIED 2017. Lecture Notes in Computer Science, vol. 10331. Springer, Cham. https://doi.org/10.1007/978-3-319-61425-0_33
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-61424-3
Online ISBN: 978-3-319-61425-0