DOI: 10.1145/3430895.3460159
Work in Progress

Exploring Design Choices in Data-driven Hints for Python Programming Homework

Published: 08 June 2021

Abstract

Students often struggle during programming homework and may need help getting started or localizing errors. One promising and scalable solution is automated programming hints, generated from prior student data, which suggest how a student can edit their code to move closer to a solution. However, little work has explored how to design such hints for large-scale, real-world classroom settings, or evaluated those designs. In this paper, we present CodeChecker, a system that generates hints automatically from student data and incorporates them into an existing CS1 online homework environment used by over 1,000 students per semester. We present insights from survey and interview data about student and instructor perceptions of the system. Our results highlight affordances and limitations of automated hints and suggest how specific design choices may have impacted their effectiveness.
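The paper details CodeChecker's own hint-generation approach; as a rough illustration of the general data-driven idea only (not the authors' actual algorithm), a hint can be derived by comparing a student's in-progress code against prior correct solutions and surfacing the first differing edit. The `suggest_hint` helper below is hypothetical:

```python
import difflib

def suggest_hint(student_code: str, prior_solutions: list[str]) -> str:
    """Sketch of a data-driven hint: find the prior correct solution
    closest to the student's code and report the first differing edit."""
    # Rank prior solutions by character-level similarity to the student's code.
    best = max(
        prior_solutions,
        key=lambda sol: difflib.SequenceMatcher(None, student_code, sol).ratio(),
    )
    s_lines = student_code.splitlines()
    b_lines = best.splitlines()
    # Walk line-level edit operations and turn the first one into a hint.
    for tag, i1, i2, j1, j2 in difflib.SequenceMatcher(
            None, s_lines, b_lines).get_opcodes():
        if tag == "replace":
            return f"Consider changing line {i1 + 1}: try something like {b_lines[j1]!r}"
        if tag == "insert":
            return f"You may be missing a line near line {i1 + 1}: {b_lines[j1]!r}"
        if tag == "delete":
            return f"Line {i1 + 1} may be unnecessary: {s_lines[i1]!r}"
    return "Your code already matches a known solution."
```

Real systems such as the one studied here must additionally handle vast solution spaces, uncertainty in hint quality, and presentation choices, which are exactly the design dimensions the paper examines.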

Supplementary Material

MP4 File (L-at-S21-lswp100.mp4)


Cited By

  • (2022) Adaptive Immediate Feedback for Block-Based Programming: Design and Evaluation. IEEE Transactions on Learning Technologies 15, 3 (406-420). DOI: 10.1109/TLT.2022.3180984. Online publication date: 1-Jun-2022.



Published In

L@S '21: Proceedings of the Eighth ACM Conference on Learning @ Scale
June 2021
380 pages
ISBN:9781450382151
DOI:10.1145/3430895
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. automated programming hints
  2. computing education

Qualifiers

  • Work in progress

Conference

L@S '21: Eighth ACM Conference on Learning @ Scale
June 22 - 25, 2021
Virtual Event, Germany

Acceptance Rates

Overall Acceptance Rate 117 of 440 submissions, 27%

