Enriched assessment rubrics: a new medium for enabling teachers to easily assess student’s performance when participating in complex interactive learning scenarios

  • Original Paper
  • Published:
Operational Research

Abstract

Nowadays, teachers in school practice design and apply complex interactive learning scenarios based on standardised collaborative learning strategies such as jigsaw and think-pair-share (TPS). There is a well-justified and immediate need to help teachers easily assess students' performance. Assessment in such complex educational scenarios is not a trivial issue, since teachers have to combine data gathered on each student's individual performance with data gathered throughout group collaboration activities. Effective means of facilitating the assessment task are rubrics and interaction analysis indicators. This paper proposes enriching traditional assessment rubrics with indicators that measure students' performance in computer-supported synchronous or asynchronous collaborative activities. We therefore propose a new conceptual assessment framework that describes how an enriched assessment rubric can be built. We present an example of an enriched assessment rubric for a complex interactive learning scenario on the controversial issue of human cloning, through which the teacher can efficiently evaluate (1) students' individual (autonomous) and group performance and (2) the spectrum of students' interactions in each phase of the scenario and as a whole. Comments on the usability of this new type of rubric are reported, together with ideas for future research.
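To make the idea of an enriched rubric more concrete, the following minimal sketch (in Python; all class names, field names and values are hypothetical and not taken from the paper) shows one possible way to pair a traditional rubric criterion with interaction analysis indicators, so that a teacher-assigned individual score and a collaboration score derived from interaction data are blended into a single assessment:

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class InteractionIndicator:
    """An interaction analysis indicator, e.g. number of messages posted in a forum."""
    name: str
    value: float       # observed value for the student
    max_value: float   # value that earns full credit for this indicator

    def normalised(self) -> float:
        # Scale the raw indicator to 0..1 so it can be mixed with rubric scores.
        return min(self.value / self.max_value, 1.0) if self.max_value else 0.0


@dataclass
class EnrichedCriterion:
    """A traditional rubric criterion enriched with interaction analysis indicators."""
    description: str
    individual_score: float                   # teacher-assigned score (0..1) for individual work
    indicators: List[InteractionIndicator] = field(default_factory=list)
    indicator_weight: float = 0.4             # share of the final score driven by collaboration data

    def combined_score(self) -> float:
        # Blend the teacher's judgement with the normalised collaboration indicators.
        if not self.indicators:
            return self.individual_score
        collaboration = sum(i.normalised() for i in self.indicators) / len(self.indicators)
        return (1 - self.indicator_weight) * self.individual_score + self.indicator_weight * collaboration


# Illustrative criterion for a human-cloning debate scenario (all numbers invented).
criterion = EnrichedCriterion(
    description="Contributes well-argued positions on human cloning to the group discussion",
    individual_score=0.75,
    indicators=[
        InteractionIndicator("messages posted", value=12, max_value=20),
        InteractionIndicator("replies received", value=5, max_value=10),
    ],
)
print(f"Combined score: {criterion.combined_score():.2f}")  # e.g. 0.67
```

In the framework described in the paper, such indicators would be drawn from the interaction logs of the collaborative learning environment; here they are supplied by hand purely for illustration.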



Author information

Corresponding author

Correspondence to Symeon Retalis.

About this article

Cite this article

Petropoulou, O., Vassilikopoulou, M. & Retalis, S. Enriched assessment rubrics: a new medium for enabling teachers to easily assess student’s performance when participating in complex interactive learning scenarios. Oper Res Int J 11, 171–186 (2011). https://doi.org/10.1007/s12351-009-0047-5

  • DOI: https://doi.org/10.1007/s12351-009-0047-5

Keywords
