Abstract
Educational peer assessment has proven to be a useful approach for giving students timely feedback and for letting them help and learn from one another. Reviewers are often expected both to provide formative feedback (textual comments telling the authors where and how to improve an artifact) and to assign peer grades at the same time. Formative feedback matters to authors, because timely, insightful comments can help them improve their artifacts; peer grading matters to the teaching staff, because it provides additional input for determining final grades. In a large class or a MOOC, where help from the teaching staff is limited, formative feedback from peers may be the best help that authors receive. To guarantee the quality of formative feedback and the reliability of peer grading, instructors should continually improve their peer-assessment rubrics. In this study, we used students' responses from the past three years in the Expertiza peer-assessment system to analyze the quality of 15 existing rubrics used across 61 assignments. We identified a set of patterns relating rubric design to peer-grading reliability and comment length, and we offer a corresponding set of rubric-design guidelines.
Acknowledgments
This project is sponsored by the National Science Foundation under grant DUE 1432347.
Copyright information
© 2015 Springer International Publishing Switzerland
Cite this paper
Song, Y., Hu, Z., Gehringer, E.F. (2015). Closing the Circle: Use of Students’ Responses for Peer-Assessment Rubric Improvement. In: Li, F., Klamma, R., Laanpere, M., Zhang, J., Manjón, B., Lau, R. (eds) Advances in Web-Based Learning – ICWL 2015. Lecture Notes in Computer Science, vol 9412. Springer, Cham. https://doi.org/10.1007/978-3-319-25515-6_3
DOI: https://doi.org/10.1007/978-3-319-25515-6_3
Print ISBN: 978-3-319-25514-9
Online ISBN: 978-3-319-25515-6