A Framework for Using Hypothesis-Driven Approaches to Support Data-Driven Learning Analytics in Measuring Computational Thinking in Block-Based Programming Environments

Abstract
Systematic endeavors to take computer science (CS) and computational thinking (CT) to scale in middle and high school classrooms are underway, with curricula that emphasize the enactment of authentic CT skills, especially in the context of programming in block-based programming environments. There is, therefore, a growing need to measure students' learning of CT in the context of programming, and also to support all learners through the process of learning computational problem solving. This research explores how hypothesis-driven approaches can be combined with data-driven ones to better interpret student actions and processes in log data captured from block-based programming environments, in order to measure and assess students' CT skills. Informed by past literature, and based on our empirical work examining a dataset from the use of the Fairy Assessment in the Alice programming environment in middle schools, we present a framework that formalizes a process in which a hypothesis-driven approach informed by Evidence-Centered Design effectively complements data-driven learning analytics in interpreting students' programming process and assessing CT in block-based programming environments. We apply the framework to the design of Alice tasks for high school CS to be used for measuring CT during programming.
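To make the complementarity concrete, the following is a purely illustrative sketch, not the paper's actual evidence model: the event names, the `LOG` fixture, and the `edit_test_ratio` rule are hypothetical stand-ins. A hypothesis-driven evidence rule (here, "edits immediately followed by a program run" as evidence of incremental edit-then-test behavior) is computed alongside a data-driven feature (raw action-bigram counts of the kind a clustering or sequence-mining step would consume) over the same per-student log sequences.

```python
from collections import Counter

# Hypothetical event log: (student_id, action) pairs in time order.
# Real block-based environment logs would carry richer events.
LOG = [
    ("s1", "add_block"), ("s1", "run"), ("s1", "add_block"), ("s1", "run"),
    ("s2", "add_block"), ("s2", "add_block"), ("s2", "add_block"), ("s2", "run"),
]

def actions_by_student(log):
    """Group the flat event log into one action sequence per student."""
    seqs = {}
    for student, action in log:
        seqs.setdefault(student, []).append(action)
    return seqs

def edit_test_ratio(seq):
    """Hypothesis-driven evidence rule (illustrative): the fraction of
    edits immediately followed by a run, taken as evidence of
    incremental edit-then-test behavior."""
    edits = [i for i, a in enumerate(seq) if a == "add_block"]
    if not edits:
        return 0.0
    followed = sum(1 for i in edits if i + 1 < len(seq) and seq[i + 1] == "run")
    return followed / len(edits)

def bigram_counts(seq):
    """Data-driven feature: action-bigram frequencies, the kind of input
    a clustering or sequence-mining step would operate on."""
    return Counter(zip(seq, seq[1:]))

seqs = actions_by_student(LOG)
# s1 runs after every edit; s2 batches three edits before a single run,
# so the evidence rule separates the two process profiles.
profiles = {s: edit_test_ratio(seq) for s, seq in seqs.items()}
```

The point of the sketch is the division of labor: the evidence rule carries an a priori interpretation of what a pattern means, while the bigram features let patterns emerge bottom-up; the framework in this paper formalizes how the two are brought together.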