ABSTRACT
In this paper, we describe a new automated tool that analyzes how students create their projects in Scratch 3.0. Our goal is to understand learning trajectories in a way that accounts for students’ programming processes and practices, moving beyond the analysis of computational thinking concepts as the sole evidence of learning. Drawing on a combination of qualitative video analysis and temporal learning analytics, we also present preliminary data from a pilot study that illustrates some of the possibilities this type of analytical tool affords. We expect that our tool can help researchers better understand learning in the context of data visualization activities with block-based programming languages by shedding light on processes that are usually invisible and, thus, better support students along their diverse learning pathways.
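The paper does not specify its tool's implementation, but the kind of automated project analysis it describes can be sketched in a few lines. A Scratch 3.0 project is distributed as an `.sb3` file, which is a zip archive containing a `project.json` that lists every sprite and its blocks; the sketch below (a minimal illustration, not the authors' tool — the function names `load_project` and `count_opcodes` are our own) loads that JSON and tallies block opcodes per project as a first step toward analyzing what students built.

```python
import json
import zipfile
from collections import Counter

def load_project(sb3_path):
    """Read project.json out of a Scratch 3.0 .sb3 archive (a zip file)."""
    with zipfile.ZipFile(sb3_path) as archive:
        return json.loads(archive.read("project.json"))

def count_opcodes(project):
    """Tally block opcodes across all targets (the stage and every sprite)."""
    counts = Counter()
    for target in project.get("targets", []):
        for block in target.get("blocks", {}).values():
            # Top-level variable/list reporters are stored as arrays,
            # not dicts, in project.json; skip those.
            if isinstance(block, dict):
                counts[block["opcode"]] += 1
    return counts
```

Snapshotting such counts over time (e.g., per save or per edit event) is one way to move from analyzing a finished artifact to analyzing the programming process, as the paper advocates.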