ABSTRACT
Recent research suggests that one-third of the students enrolled in CS1 courses typically end up failing [4]. Several studies have demonstrated how learning tools can assist struggling students [3]. We present BitFit, an online, open-source practice programming tool developed to (1) provide students with an environment to practice weekly material and receive support when needed; and (2) collect usage data as students progress through programming exercises [2]. We present the core features of BitFit, as well as our qualitative and quantitative analysis of 652 students over three semesters of our CS1 course.
Our findings support recent studies suggesting that at-risk students can be identified as early as two weeks into the semester [1]; the metric used in our study identified, with very high certainty, over 29% of the students who ended up failing. Our results also reveal that interaction patterns with BitFit, in particular with hint features requested by students, allow the identification of another 52% of students who eventually fail. Throughout the semester, students who failed the course used hint features four times as often as top students, while attempting to compile code only one-third as often. The combination of early indicators and interaction patterns identifies 81% of the students who failed the course during our study.
Students were asked to reflect on their study habits and report on course progress in monthly surveys. Although workflow patterns on a per-question basis were very different between unsuccessful and successful students, students from both groups believed BitFit was effective in helping them learn the course material. Unfortunately, many students who ended up failing chose to learn course concepts by reading through hints and sample solutions. It appears that many of these students believed that memorizing a solution in BitFit was a more productive strategy than solving each problem and programming a solution on their own.
The quantitative data collected by BitFit suggests that there are identifiable differences in workflow patterns between unsuccessful and successful students. Qualitative data collected through student surveys suggests that unsuccessful students do not realize that their study habits are unlikely to lead to success. We are currently exploring intervention strategies to guide at-risk students towards more productive ways of learning course material.
Using a mixed-methods approach combining quantitative BitFit log data with qualitative survey data, we plan to continue to analyze and better understand where, when, and how our students exhibit ineffective study habits. We look forward to feedback and discussion about possible intervention techniques, and about how best to evaluate the effectiveness of interventions that use interactive learning environments to improve student study and learning habits.
In summary, the main contributions of this work are to (1) illustrate the differences between successful and unsuccessful student workflow patterns based on log data; (2) highlight the differences in survey responses between unsuccessful and successful students about the perceived effectiveness of their own study habits; and (3) overview a number of intervention techniques to potentially incorporate into upcoming course offerings. We believe that as we continue to learn more about the reasons certain students exhibit ineffective study habits, our efforts to support at-risk students will increasingly result in student success.
REFERENCES
[1] A. Ahadi, R. Lister, H. Haapala, and A. Vihavainen. Exploring machine learning methods to automatically identify students in need of assistance. In Proceedings of the Eleventh Annual International Conference on International Computing Education Research, ICER '15, New York, NY, USA, 2015. ACM.
[2] A. Estey, A. Russo Kennedy, and Y. Coady. BitFit: If you build it, they will come! In Proceedings of the 21st Western Canadian Conference on Computing Education, WCCCE '16, New York, NY, USA, 2016. ACM.
[3] A. Papancea, J. Spacco, and D. Hovemeyer. An open platform for managing short programming exercises. In Proceedings of the Ninth Annual International ACM Conference on International Computing Education Research, ICER '13, New York, NY, USA, 2013. ACM.
[4] C. Watson and F. W. Li. Failure rates in introductory programming revisited. In Proceedings of the 2014 Conference on Innovation & Technology in Computer Science Education, ITiCSE '14, New York, NY, USA, 2014. ACM.