Analyzing productive learning behaviors for students using immediate corrective feedback in a blended learning environment
Introduction
In educational settings, feedback serves the crucial role of closing the gap between students’ current understanding and desired learning (Shute, 2008). Timely and informative feedback can help learners recognize and correct misconceptions, motivate them to acquire knowledge, and increase their confidence and motivation to learn (Epstein et al., 2010). Yet in higher education it is often not possible for instructors to provide timely feedback to every student individually, especially in first-year foundational courses with large enrollments. Feedback delivered on computer-based platforms such as edX (www.edx.org), Coursera (www.coursera.org), and FutureLearn (www.futurelearn.com) offers a potential solution. These platforms can provide students with automatic, immediate feedback on formative assessment tasks such as reading questions, homework problems, and short quizzes, and they record in detail the activities students perform online. In turn, these data logs can be re-assembled to give researchers an in-depth look into how students are learning the concepts and skills instructors want them to master.
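To illustrate what "re-assembling" such data logs can look like in practice, the sketch below groups raw clickstream records by student and orders each student's events in time. The field names (`user`, `time`, `action`) and the sample events are our own assumptions for illustration; real platform logs have richer, platform-specific schemas.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical clickstream records; field names are illustrative assumptions.
events = [
    {"user": "s2", "time": "2016-02-01T10:05:00", "action": "check_answer"},
    {"user": "s1", "time": "2016-02-01T10:01:00", "action": "open_problem"},
    {"user": "s1", "time": "2016-02-01T10:04:00", "action": "check_answer"},
]

def sequences_by_student(events):
    """Group raw log events by student and sort each group by timestamp,
    yielding one chronological activity sequence per student."""
    grouped = defaultdict(list)
    for e in events:
        grouped[e["user"]].append(e)
    for seq in grouped.values():
        seq.sort(key=lambda e: datetime.fromisoformat(e["time"]))
    return dict(grouped)

seqs = sequences_by_student(events)
print([e["action"] for e in seqs["s1"]])  # ['open_problem', 'check_answer']
```

Sequences like these are the raw material from which per-student behavior patterns can later be derived.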
The specific feedback mechanism we focus on in this study is the “checkable answer feature” (CAF) included in a suite of online materials in an introductory blended learning course in physics, which we call PHYS101, at an elite private university in the northeast United States. The CAF, a rudimentary intelligent tutoring system, gives students immediate feedback on whether their answers to online homework problems are correct. Once a student enters an answer, the system returns a green tick mark if it is correct and a red X if it is wrong (see Fig. 1). In addition, students use the CAF to check their answers to multi-part problems, which they must write out, before submitting their traditional, paper-based homework assignments.
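The kind of check the CAF performs can be sketched as follows. This is a minimal illustration, not the actual system: the relative tolerance, the handling of unparseable input, and the tick/X strings are all our assumptions.

```python
def check_answer(submitted, correct, rel_tol=0.01):
    """Return a CAF-style mark: a tick for a numeric answer within a
    relative tolerance of the correct value, an X otherwise.
    The tolerance and symbols are illustrative assumptions."""
    try:
        value = float(submitted)
    except ValueError:
        return "X"  # unparseable input is treated as incorrect
    if abs(value - correct) <= rel_tol * abs(correct):
        return "tick"
    return "X"

print(check_answer("9.81", 9.8))  # tick
print(check_answer("12", 9.8))    # X
```

The key property for this study is not the grading logic itself but that each check is logged, so every attempt a student makes leaves a trace in the data.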
It is evident that formative feedback can foster improved achievement and enhanced motivation to learn, while also supporting deep and self-regulated learning (Crisp and Ward, 2008, Koh, 2008, Wolsey, 2008). Studies of courses that are wholly online (Crisp and Ward, 2008, Gijbels et al., 2005, Sorensen and Takle, 2005, Van der Pol et al., 2008, Vonderwell et al., 2007, Wang et al., 2008) show that the effective use of online formative assessment can engage students and teachers in meaningful educational experiences, providing them with opportunities to collaboratively identify learning needs and devise strategies to meet them. Other studies have investigated the effectiveness of online or computer-assisted feedback tools in blended learning environments (e.g., Angus and Watson, 2009, Chung et al., 2006, Feldman and Capobianco, 2008, Lin, 2008, Wang et al., 2008). Foundational work by Van der Kleij, Feskens, and Eggen (2015), synthesizing a large number of studies, indicates that elaborated feedback is more effective than simple corrective feedback; this effect is more pronounced for higher-level learning outcomes and in certain disciplines, for example, in mathematics as compared with science. Interestingly, several studies, including recent work by Van der Kleij, observe no difference between corrective feedback and elaborated feedback (Mazingo, 2006, Van der Kleij et al., 2012).
This raises the possibility that how students engage with feedback may have a significant influence on how feedback affects learning outcomes. While there is evidence that the amount of time spent with feedback is mediated by student attitudes and motivations (Narciss and Huth, 2006, Van der Kleij et al., 2012), to our knowledge there are no quantitative studies of how the level and patterns of student engagement with corrective feedback affect learning outcomes. Our study fills this gap by asking the following research questions:
RQ1: How much do students interact with simple corrective feedback in a blended undergraduate physics class, and what are the patterns of engagement?
RQ2: (a) What patterns of engagement with simple corrective feedback comprise productive study behaviors and are associated with stronger performance, and (b) what patterns are negatively correlated with performance?
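The engagement measures implied by RQ1 can be made concrete with a small sketch: from a student's logged answer-check events, compute how many times they checked and how long they typically waited between checks. The event tuples and field ordering are hypothetical; they stand in for whatever schema a real platform log uses.

```python
from datetime import datetime

# Hypothetical answer-check events for one student on one problem:
# (student id, problem id, ISO timestamp). Values are illustrative.
checks = [
    ("s1", "hw1_p2", "2016-02-01T10:04:00"),
    ("s1", "hw1_p2", "2016-02-01T10:09:00"),
    ("s1", "hw1_p2", "2016-02-01T10:10:30"),
]

def engagement_features(checks):
    """RQ1-style engagement measures: number of answer checks and the
    median gap in seconds between consecutive checks."""
    times = sorted(datetime.fromisoformat(t) for _, _, t in checks)
    gaps = sorted((b - a).total_seconds() for a, b in zip(times, times[1:]))
    median_gap = gaps[len(gaps) // 2] if gaps else None
    return {"n_checks": len(times), "median_gap_s": median_gap}

print(engagement_features(checks))  # {'n_checks': 3, 'median_gap_s': 300.0}
```

Features of this kind, computed per student, are what analyses for RQ2 would then correlate with course performance.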
Computer-based feedback systems are becoming more sophisticated and of growing benefit in higher education. Individual feedback can be more easily scaled to large numbers of learners because each student can interact with information that is instantaneously provided, and the material can be focused on the individual student's cognitive needs, study strategies, and preferences for how material is presented. By combining advances in artificial intelligence with content experts' knowledge of difficult concepts and common misconceptions, the material can also address the stumbling blocks that interfere with learning in particular disciplines. Feedback can be delivered when and where it is most convenient for the student (a dorm room, on-campus common study space, or Starbucks, for that matter). The ever-increasing use of these online educational platforms, coupled with rich, large-scale data gathering tools, enables an unprecedented window into student use of computer-based feedback mechanisms. Recognizing these interactions can help instructors deliver more targeted interventions, as well as help platform developers create the features that can help maximize the potential of these online systems. This study also contributes to understanding how a large, detailed online dataset can reveal patterns of student study behaviors. In turn, this demonstrates ways in which data mining can be used to complement traditional statistics and generate new knowledge. Finally, correlating online feedback behaviors with course performance sheds light on how self-regulated learning and productive struggle contribute to positive student outcomes.
Section snippets
Context of the study
Our specific study was undertaken within the context of a semester-long required residential physics course in mechanics, which all undergraduates at the university must pass to graduate. A well-respected faculty member in the physics department began to experiment with redesigning the course in 2000, moving away from lectures to a model that incorporated active learning pedagogies. (In physics, this model of teaching is known as “studio physics” (Beichner et al., 2007, Beichner and Saul, 2003,
Theoretical framework
This study is grounded both in the research on feedback in learning, which has a long and rich history, and relatively new scholarship on computer-mediated feedback, including comparisons between students’ use of feedback provided by intelligent tutoring systems and online learning platforms. Some of the behaviors we see indicate a subset of the students are using strategies associated with self-regulation, and we utilize that body of work as well. We knit this applicable background literature
Methodology
In order to identify students’ behavioral patterns with the CAF and to understand the relationship between those patterns and performance metrics, we needed to go through several distinct steps, balancing theory with emergent patterns we saw in our data. First, we generated descriptive statistics, and then we used inferential statistics and data mining techniques to determine the relationship between student behaviors and their outcomes. This approach allows us to take advantage of traditional
Results
The results of our analyses allow us to begin to answer the research questions that motivate this study. We can identify how students are using immediate feedback in solving online homework problems and what behaviors correlate with a stronger performance in the course.
Discussion
This study utilizes rich quantitative data to understand how students interact with immediate feedback on their online homework assignments in a blended learning course. The feedback is simple and corrective: once the students submit an answer to a problem, the checkable answer feature (CAF) responds with either a green tick mark or a red X to tell them if they are right or wrong. Although a substantial body of research exists on the importance of feedback to the learning process, clickstream
Conclusion
Online platforms can now provide researchers with data that indicate what resources students are using, how they come to understand concepts, and what behaviors reflect effective study strategies. Thus, these platforms allow us to study learning at a deeper level than we have been able to. In this study, we examine how students use very simple online task-level feedback to master material that they may not have encountered before, recognizing that feedback is integral to the learning process.
Acknowledgements
We would like to recognize the contributions of undergraduate researcher Xingyu Zhou (Purdue University) in data cleaning and understanding the data generating mechanism of the platform. We would like to thank the instructors and researchers at the university studied for their participation and help with the ongoing investigation.
We would like to acknowledge the support and feedback from the first author’s doctoral committee members Dr. Ruth Streveler and Dr. Sean Brophy.
This research is based
References (59)

Self-regulated learning: Where we are today. International Journal of Educational Research (1999).

The development of a formative scenario-based computer assisted assessment tool in psychology for teachers: The PePCAA project. Computers & Education (2008).

Implementing flexible hybrid instruction in an electrical engineering course: The best of three worlds? Computers & Education (2015).

Refocusing formative feedback to enhance learning in pre-registration nurse education. Nurse Education in Practice (2008).

Preservice teachers' learning experiences of constructing e-portfolios online. The Internet and Higher Education (2008).

Effects of a computer-assisted formative assessment intervention based on multiple-tier diagnostic items and different feedback types. Computers & Education (2016).

Tracking student behavior, persistence, and achievement in online courses. The Internet and Higher Education (2005).

Fostering achievement and motivation with bug-related tutoring feedback in a computer-based training for written subtraction. Learning and Instruction (2006).

Detecting and preventing “multiple-account” cheating in massive open online courses. Computers & Education (2016).

Effects of feedback in a computer-based assessment for learning. Computers & Education (2012).