DOI: 10.1145/2526968.2526970

Recording and analyzing in-browser programming sessions

Published: 14 November 2013

ABSTRACT

In this paper, we report on the analysis of a novel type of automatically recorded, detailed programming session data collected on a university-level web programming course. We present a method and an implementation for collecting rich data on how students learning to program edit and execute code, and explore its use in examining learners' behavior. The data collection instrument is an in-browser Python programming environment that integrates an editor, an execution environment, and an interactive Python console, and is used to deliver programming assignments with automatic feedback. Most importantly, the environment records learners' interaction within it. We have implemented tools for viewing these traces and demonstrate their potential for learning about learners' programming processes and for benefiting computing education research and the teaching of programming.
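The abstract does not spell out the recorded trace format, but the kind of instrumentation it describes can be pictured with a short sketch. The following is a minimal, hypothetical Python example; the names TraceEvent and TraceRecorder and the JSON-lines layout are assumptions for illustration, not the authors' implementation. It logs timestamped edit, execute, and console events so that a session can later be replayed.

import json
import time
from dataclasses import dataclass, asdict
from typing import Optional

@dataclass
class TraceEvent:
    """One recorded interaction: an edit, an execution, or a console entry."""
    session_id: str               # hypothetical identifier for one student's session
    kind: str                     # "edit", "execute", or "console"
    timestamp: float              # seconds since the epoch
    payload: str                  # edited source, executed program, or console input
    output: Optional[str] = None  # interpreter output, when applicable

class TraceRecorder:
    """Appends events to a JSON-lines log so sessions can be replayed later."""

    def __init__(self, path: str):
        self.path = path

    def record(self, session_id: str, kind: str, payload: str,
               output: Optional[str] = None) -> None:
        event = TraceEvent(session_id, kind, time.time(), payload, output)
        with open(self.path, "a", encoding="utf-8") as log:
            log.write(json.dumps(asdict(event)) + "\n")

# Example: logging one edit and one execution of a student's program.
recorder = TraceRecorder("session_traces.jsonl")
recorder.record("student-42", "edit", "print('Hello')")
recorder.record("student-42", "execute", "print('Hello')", output="Hello\n")

A replay tool would then read the log line by line and reconstruct the sequence of editor and console states, which is presumably the kind of viewing the abstract's trace tools provide.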



    Reviews

    Donald J. Bagert

    The primary focus of this paper is the analysis of automatically recorded programming session data collected during a college-level course on web programming. Data is collected through an in-browser Python integrated development environment (IDE) with an editor, a Python console, and an execution environment. The authors' goal is to demonstrate the potential benefit, both to computing education research and to the teaching of programming, of collecting data on the programming processes of introductory-level college students. The paper is background-intensive: half of the ten-page paper is devoted to previous research involving automatically recorded programming sessions. This degree of background is very useful, but seems more suited to a much longer journal paper. The next section (one page) describes the data collection environment and the experiment, in which students complete three different programming assignments and receive feedback through unit tests provided by the instructors. Most of the remainder of the paper focuses on the analysis and results stemming from this experiment. The figures with screen captures of the console during programming are very useful. This is a good conference paper. Hopefully, the authors will be able to publish their results in a journal, which would give them the avenue to discuss more of the details and results of their research.

    Online Computing Reviews Service
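    The review notes that students received feedback through unit tests provided by the instructors. The paper's actual tests are not reproduced here; the sketch below is a hypothetical illustration, using Python's standard unittest module, of how automatic feedback on a small assumed assignment (an average function) might be expressed.

import unittest

# Hypothetical student submission: the assignment asks for a function
# that returns the average of a list of numbers.
def average(numbers):
    return sum(numbers) / len(numbers)

class TestAverage(unittest.TestCase):
    """Instructor-provided checks that turn into automatic feedback."""

    def test_simple_case(self):
        self.assertAlmostEqual(average([1, 2, 3]), 2.0)

    def test_single_element(self):
        self.assertAlmostEqual(average([5]), 5.0)

    def test_empty_list(self):
        # A common novice omission: no guard for the empty list.
        with self.assertRaises(ZeroDivisionError):
            average([])

if __name__ == "__main__":
    unittest.main()

    Each failing assertion becomes a concrete message that such an environment could report back to the student on submission.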

    • Published in

      Koli Calling '13: Proceedings of the 13th Koli Calling International Conference on Computing Education Research
      November 2013, 204 pages
      ISBN: 9781450324823
      DOI: 10.1145/2526968
      Copyright © 2013 ACM


      Publisher

      Association for Computing Machinery, New York, NY, United States


      Qualifiers

      • research-article

      Acceptance Rates

      Koli Calling '13 paper acceptance rate: 20 of 40 submissions, 50%. Overall acceptance rate: 80 of 182 submissions, 44%.
