DOI: 10.1145/3501712.3529742
Honorable Mention Short Paper

Using video analysis and learning analytics to understand programming trajectories in data science activities with Scratch

Published: 27 June 2022

ABSTRACT

In this paper, we describe a new automated tool that analyzes how students create their projects in Scratch 3.0, with the goal of understanding learning trajectories in a way that accounts for students’ programming processes and practices, moving beyond the analysis of computational thinking concepts as the sole evidence of learning. Drawing on a combination of qualitative video analysis and temporal learning analytics, we also present preliminary data from a pilot study that illustrates some of the possibilities afforded by this type of analytical tool. We expect that our tool can help researchers better understand learning in the context of data visualization activities with block-based programming languages by shedding light on processes that are usually invisible, and thereby better support students in their diverse learning pathways.
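The abstract does not describe the tool’s internals, but for readers unfamiliar with this style of analysis, the Python sketch below illustrates the two ingredients it combines: counting computational-thinking blocks in a saved Scratch 3.0 project.json, and bucketing a timestamped block-edit log into per-interval concept counts, i.e., a crude programming trajectory. The opcode-to-concept mapping and the edit-event format are illustrative assumptions, not the authors’ instrument; only the project.json structure (a "targets" list whose "blocks" dicts map block ids to objects with an "opcode" field) is part of the real Scratch 3.0 format.

```python
"""Illustrative sketch only: the concept mapping and event-log format
below are assumptions, loosely modeled on Dr. Scratch-style categories,
not the tool described in the paper."""
import json
from collections import Counter, defaultdict

# Hypothetical mapping from (real) Scratch 3.0 opcodes to CT concepts.
CONCEPTS = {
    "control_repeat": "loops",
    "control_forever": "loops",
    "control_if": "conditionals",
    "control_if_else": "conditionals",
    "event_whenflagclicked": "events",
    "data_setvariableto": "data",
    "data_changevariableby": "data",
}

def block_concepts(project_json_path):
    """Count CT-concept blocks in a saved Scratch 3.0 project.json."""
    with open(project_json_path) as f:
        project = json.load(f)
    counts = Counter()
    for target in project["targets"]:  # one target per sprite/stage
        for block in target["blocks"].values():
            # Some top-level reporters serialize as arrays; skip those.
            if isinstance(block, dict):
                concept = CONCEPTS.get(block.get("opcode"))
                if concept:
                    counts[concept] += 1
    return counts

def trajectory(events, interval_s=60):
    """Bucket a (hypothetical) timestamped edit log into per-interval
    concept counts. Each event is assumed to look like:
        {"t": 12.3, "action": "add", "opcode": "control_repeat"}
    """
    buckets = defaultdict(Counter)
    for e in events:
        concept = CONCEPTS.get(e.get("opcode"))
        if concept and e.get("action") == "add":
            buckets[int(e["t"] // interval_s)][concept] += 1
    return dict(sorted(buckets.items()))
```

A result such as {0: Counter({'events': 1}), 3: Counter({'loops': 2})} can then be plotted per student or aligned with video-coded episodes, which is the kind of process-level view the abstract argues is usually invisible in concept-only assessment.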


Published in

IDC '22: Proceedings of the 21st Annual ACM Interaction Design and Children Conference
June 2022, 718 pages
ISBN: 9781450391979
DOI: 10.1145/3501712
Copyright © 2022 ACM
Publisher: Association for Computing Machinery, New York, NY, United States