Research article · Koli Calling Conference Proceedings · DOI: 10.1145/3428029.3428040

Exploring the Bug Investigation Techniques of Intermediate Student Programmers

Published: 22 November 2020

Abstract

Bug investigation – testing and debugging – is a significant part of software development, and ineffective bug investigation practices can greatly hinder a project's progress. We therefore seek to understand the bug investigation practices of intermediate student programmers. To this end, we used a mixed-methods approach to study the testing and debugging practices of students in a junior-level Data Structures course with projects lasting 3–4 weeks. First, we interviewed 12 students with varying levels of project performance. From the interviews, we identified five techniques that students use for both testing and debugging: 1) writing diagnostic print statements, 2) writing unit tests, 3) using a source-level debugger, 4) submitting to an online auto-grader, and 5) manually tracing code. Using a Grounded Theory approach, we developed four hypotheses about students’ use of multiple techniques and its possible impact on performance. We then used clickstream data to analyze how heavily students used the first four of these techniques. We found that over 92%, 87%, and 73% of students used JUnit testing, diagnostic print statements, and the source-level debugger, respectively, and that the large majority of students (91%) used more than one technique to investigate bugs in their projects. Moreover, students who used multiple techniques performed better overall on the projects. Finally, we identified some ineffective practices that correlated with lower project scores. We believe these findings will help educators understand, characterize, and teach better practices of bug investigation.
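
To make the first two of these techniques concrete, the sketch below shows what a JUnit unit test with an embedded diagnostic print statement might look like in a Java project of the kind studied. The code is illustrative only and does not appear in the paper; it assumes JUnit 4 and uses the standard-library ArrayDeque in place of a student-written data structure.

    import static org.junit.Assert.assertEquals;

    import java.util.ArrayDeque;
    import org.junit.Test;

    // Illustrative only: ArrayDeque stands in for a student-written structure.
    public class StackBehaviorTest {

        // Technique 2 (unit testing): a JUnit test pins down one expected
        // behavior in isolation, so a regression is caught automatically.
        @Test
        public void pushThenPopReturnsLastElementPushed() {
            ArrayDeque<Integer> stack = new ArrayDeque<>();
            stack.push(1);
            stack.push(2);

            // Technique 1 (diagnostic print statement): temporary output used
            // while localizing a fault, typically deleted once the bug is found.
            System.out.println("size before pop: " + stack.size());

            assertEquals(Integer.valueOf(2), stack.pop());
        }
    }

The assertion encodes the expected behavior, while the print statement exposes intermediate state as the test runs, which is consistent with the paper's observation that students combine techniques rather than relying on a single one.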


Cited By

  • An Analysis of Students' Testing Processes in CS1. Proceedings of the 56th ACM Technical Symposium on Computer Science Education V. 1 (2025), 46–52. https://doi.org/10.1145/3641554.3701890. Online publication date: 12 Feb 2025.
  • Testing and Debugging Habits of Intermediate Student Programmers. 2024 IEEE Global Engineering Education Conference (EDUCON) (2024), 1–10. https://doi.org/10.1109/EDUCON60312.2024.10578650. Online publication date: 8 May 2024.
  • An Empirical Evaluation of Live Coding in CS1. Proceedings of the 2023 ACM Conference on International Computing Education Research – Volume 1 (2023), 476–494. https://doi.org/10.1145/3568813.3600122. Online publication date: 7 Aug 2023.
  • Exploring the Impact of Cognitive Awareness Scaffolding for Debugging in an Introductory Programming Class. Proceedings of the 54th ACM Technical Symposium on Computer Science Education V. 1 (2023), 1007–1013. https://doi.org/10.1145/3545945.3569871. Online publication date: 2 Mar 2023.


Published In

Koli Calling '20: Proceedings of the 20th Koli Calling International Conference on Computing Education Research
November 2020
295 pages
ISBN: 9781450389211
DOI: 10.1145/3428029

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. CS education
  2. bug investigation
  3. data structures and algorithms
  4. debugging
  5. post-CS2
  6. testing

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

Koli Calling '20

Acceptance Rates

Overall Acceptance Rate 80 of 182 submissions, 44%


Article Metrics

  • Downloads (last 12 months): 38
  • Downloads (last 6 weeks): 4

Reflects downloads up to 02 Mar 2025.
