DOI: 10.1145/3328778.3366849 · SIGCSE Conference Proceedings · Research Article

A Principled Approach to Designing a Computational Thinking Practices Assessment for Early Grades

Published: 26 February 2020

ABSTRACT

In today's increasingly digital world, it is critical that all students learn to think computationally from an early age. Assessments of Computational Thinking (CT) are essential for capturing information about student learning and challenges. Several existing K-12 CT assessments focus on concepts like variables, iteration, and conditionals without emphasizing practices like algorithmic thinking, reusing and remixing, and debugging. In this paper, we discuss the development of, and results from, a validated CT Practices assessment for 4th-6th grade students. The assessment tasks are multilingual and shift the focus to CT practices, making the assessment useful for students working with different CS curricula and different programming languages. Results from an implementation of the assessment with about 15,000 upper elementary students in Hong Kong indicate challenges with comparing algorithms under given constraints, deciding when code can be reused, and choosing debugging test cases. These results point to the utility of our assessment as a curricular tool and to the need to emphasize CT practices in future curricular initiatives and teacher professional development.


        • Published in

          cover image ACM Conferences
          SIGCSE '20: Proceedings of the 51st ACM Technical Symposium on Computer Science Education
          February 2020
          1502 pages
          ISBN:9781450367936
          DOI:10.1145/3328778

          Copyright © 2020 ACM


Publisher: Association for Computing Machinery, New York, NY, United States



Acceptance Rates

Overall acceptance rate: 1,595 of 4,542 submissions (35%)
