DOI: 10.1145/3160489.3160498

Comparing sequential and parallel code review techniques for formative feedback

Published: 30 January 2018

Abstract

The practice of peer review is widespread across a range of academic disciplines. We report on a study that compared two approaches to peer reviewing program code: reviewing a sequence of solutions to the same problem (sequential code review), and reviewing a set of multiple solutions side-by-side (parallel code review). We found that the majority of participants preferred the parallel approach, and there were some indications that it might be more helpful for reviewers. However, the sequential approach elicited more written comments overall, and more specific critical comments, than the parallel approach. Although reviewers may prefer parallel reviews, sequential reviews appear to result in more formative feedback for the recipient.




Published In

ACE '18: Proceedings of the 20th Australasian Computing Education Conference
January 2018
127 pages
ISBN:9781450363402
DOI:10.1145/3160489
  • Conference Chairs: Raina Mason, Simon


Publisher

Association for Computing Machinery

New York, NY, United States



Author Tags

  1. CS1
  2. code review
  3. formative feedback
  4. novice programmers
  5. parallel code review
  6. peer assessment
  7. peer feedback
  8. peer review
  9. sequential code review

Qualifiers

  • Research-article

Conference

ACE 2018
Sponsor:
  • Southern Cross University
  • SIGCSE
ACE 2018: 20th Australasian Computing Education Conference
January 30 - February 2, 2018
Brisbane, Queensland, Australia

Acceptance Rates

ACE '18 Paper Acceptance Rate: 14 of 36 submissions, 39%
Overall Acceptance Rate: 161 of 359 submissions, 45%


Cited By

  • (2024) WIP: Code Insight: Combining Code Reading and Debugging Practices for Active Learning in Entry-Level Computer Science Courses. 2024 IEEE Frontiers in Education Conference (FIE), 1-5. DOI: 10.1109/FIE61694.2024.10893296
  • (2021) Improving Student Peer Code Review Using Gamification. Proceedings of the 23rd Australasian Computing Education Conference, 80-87. DOI: 10.1145/3441636.3442308
  • (2020) Software Testing as Medium for Peer Feedback. United Kingdom & Ireland Computing Education Research Conference, 66-72. DOI: 10.1145/3416465.3416474
  • (2020) A Review of Peer Code Review in Higher Education. ACM Transactions on Computing Education 20(3), 1-25. DOI: 10.1145/3403935
  • (2020) Computer-supported Collaborative Learning in Programming Education: A Systematic Literature Review. 2020 IEEE Global Engineering Education Conference (EDUCON), 1086-1095. DOI: 10.1109/EDUCON45650.2020.9125237
  • (2020) Peer Validation and Generation Tool for Question Banks in Learning Management Systems. Applied Informatics, 435-448. DOI: 10.1007/978-3-030-61702-8_30
  • (2019) Individual, Social and Personnel Factors Influencing Modern Code Review Process. 2019 IEEE Conference on Open Systems (ICOS), 40-45. DOI: 10.1109/ICOS47562.2019.8975708
  • (2019) Situational Factors Affecting Software Engineers Sustainability: A Vision of Modern Code Review. 2019 IEEE 6th International Conference on Engineering Technologies and Applied Sciences (ICETAS), 1-6. DOI: 10.1109/ICETAS48360.2019.9117366
  • (2019) Understanding the Impact of Feedback on Knowledge Sharing in Modern Code Review. 2019 IEEE 6th International Conference on Engineering Technologies and Applied Sciences (ICETAS), 1-5. DOI: 10.1109/ICETAS48360.2019.9117268
