DOI: 10.1145/3545947.3576246
poster

Automated Structural Evaluation of Block-based Coding Assignments

Published: 6 March 2023

ABSTRACT

As computer science is integrated into a wider variety of fields, block-based programming languages like Snap!, which assemble code from visual blocks rather than text syntax, are increasingly used to teach computational thinking (CT) to students from diverse backgrounds. Automated evaluators (autograders) for programming assignments usually focus on runtime efficiency and output accuracy, but effective evaluation of a student's CT skills requires assessing coding best practices such as decomposition, abstraction, and algorithm design. While autograders are commonplace for text-based languages like Python, fewer exist for block-based languages; we present a machine learning approach to assess how effectively block-based code demonstrates understanding of CT fundamentals. Our dataset consists of Snap! programs written by students new to coding and evaluated by instructors using a CT rubric. We explore how best to transform these programs into low-dimensional features that allow encapsulation and repetition patterns to emerge. Our experiments compare the effectiveness of a suite of clustering models and similarity metrics by analyzing how closely the automated feedback correlates with the course staff's manual evaluation. Lastly, we demonstrate the practical application of the autograder in a classroom setting and discuss its scalability and feasibility in other domains of CS education.
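The abstract does not include implementation details, so the following is only a minimal, hypothetical sketch of the kind of pipeline it describes: count-based features extracted from an exported Snap! project XML (total blocks, custom block definitions, loop and conditional usage), k-means clustering over standardized features, and a Spearman correlation against instructor rubric scores. The feature choices, selector names, function names, and clustering setup are assumptions for illustration, not the authors' method.

# Hypothetical sketch (not from the poster): map Snap! project XML exports to
# low-dimensional feature vectors intended to surface decomposition and
# repetition patterns, cluster submissions, and check agreement with the
# instructors' manual rubric scores.
import xml.etree.ElementTree as ET
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from scipy.stats import spearmanr

def featurize(project_xml: str) -> np.ndarray:
    """Turn one Snap! project (XML export) into a small feature vector.
    The "s" attribute and selector names mirror Snap!'s export format but
    are used here purely for illustration."""
    root = ET.fromstring(project_xml)
    selectors = [b.get("s") or "" for b in root.iter("block")]
    return np.array([
        len(selectors),                                          # total blocks used
        len(root.findall(".//block-definition")),                # custom blocks (decomposition)
        sum(s in ("doRepeat", "doForever", "doUntil") for s in selectors),  # repetition
        sum(s in ("doIf", "doIfElse") for s in selectors),                  # branching
    ], dtype=float)

def cluster_and_correlate(projects, rubric_scores, k=4):
    """Cluster feature vectors and measure how well cluster structure
    tracks the manual rubric scores."""
    X = StandardScaler().fit_transform(np.stack([featurize(p) for p in projects]))
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    # Score each cluster by the mean rubric score of its members, then ask how
    # well that cluster-level score predicts each submission's manual score.
    cluster_means = {c: np.mean([s for s, l in zip(rubric_scores, labels) if l == c])
                     for c in set(labels)}
    predicted = [cluster_means[l] for l in labels]
    rho, _ = spearmanr(predicted, rubric_scores)
    return labels, rho

Predicting a submission's grade from its cluster's mean rubric score is one simple way to quantify how closely automated, similarity-based feedback correlates with the course staff's evaluation; the poster's actual feature transformations, clustering models, and similarity metrics are not specified in this abstract.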


Supplemental Material

SIGCSE23-V2pp0847.mp4 (mp4, 141.2 MB)


Published in

SIGCSE 2023: Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 2
March 2023, 1481 pages
ISBN: 9781450394338
DOI: 10.1145/3545947

Copyright © 2022 Owner/Author

Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher: Association for Computing Machinery, New York, NY, United States


Acceptance Rates

Overall Acceptance Rate: 1,595 of 4,542 submissions, 35%
