A case study investigating programming students’ peer review of codes and their perceptions of the online learning environment


Abstract

Programming in schools is no longer a novel subject; it is now commonly found in either the formal or informal curriculum. Programmers use creative learning tactics to solve problems and communicate ideas, yet learning to program is generally considered challenging. Developing and implementing new methodologies for teaching programming is therefore imperative to overcome the challenges currently associated with the teaching and learning of programming. This case study aims to contribute to programming education in schools by investigating how students learn in an online programming environment while engaged in peer review of codes. The study subsequently examines students’ perceptions of the pedagogical, social and technical design of the online programming learning environment. When students are involved in providing and receiving feedback and in creating their own identity in a programming community, they may be better prepared to learn and apply programming in their undergraduate studies and in their future careers in the field.


References

  • Agalianos, A., Noss, R., & Whitty, G. (2001). Logo in mainstream schools: The struggle over the soul of an educational innovation. British Journal of Sociology of Education, 22(4), 479–500.

  • Awuah, L. J. (2015). Supporting 21st-century teaching and learning: The role of Google Apps for Education (GAFE). Journal of Instructional Research, 4, 12–22.

  • Barr, V., & Guzdial, M. (2015). Advice on teaching CS, and the learnability of programming languages. Communications of the ACM, 58(3), 8–9.

  • Blau, I., & Caspi, A. (2008). Don’t edit, discuss! The influence of wiki editing on learning experience and achievement. In D. Ben-Zvi (Ed.), Innovative e-learning in higher education (pp. 19–23). Haifa: University of Haifa.

  • Boud, D., Cohen, R., & Sampson, J. (1999). Peer learning and assessment. Assessment & Evaluation in Higher Education, 24(4), 413–426.

  • Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. In Proceedings of the 2012 annual meeting of the American Educational Research Association, Vancouver, Canada (Vol. 1, p. 25).

  • Brown, M. E., & Hocutt, D. L. (2015). Learning to use, useful for learning: A usability study of Google Apps for Education. Journal of Usability Studies, 10(4), 160–181.

  • Cheng, K. H., Liang, J. C., & Tsai, C. C. (2015). Examining the role of feedback messages in undergraduate students' writing performance during an online peer assessment activity. The Internet and Higher Education, 25, 78–84.

  • Chi, M. T. (1997). Quantifying qualitative analyses of verbal data: A practical guide. The Journal of the Learning Sciences, 6(3), 271–315.

  • Chiang, F. K., & Qin, L. (2018). A pilot study to assess the impacts of game-based construction learning, using Scratch, on students’ multi-step equation-solving performance. Interactive Learning Environments, 26(6), 803–814.

  • Cho, K., Chung, T. R., King, W. R., & Schunn, C. (2008). Peer-based computer-supported knowledge refinement: An empirical investigation. Communications of the ACM, 51(3), 83–88.

  • Connelly, S. (2017). Top skills you need in your arsenal to ride the augmented reality wave. https://blog.sparksgroupinc.com/candidate/blogs/candidate/augmented-reality/top-ar-skills. Accessed 2 Nov 2018.

  • Crane, G. E. (2016). Leveraging digital communications technology in higher education: Exploring URI’s adoption of Google Apps for Education 2015.

  • Creswell, J. W. (2014). Educational research: Planning, conducting and evaluating quantitative and qualitative research (4th ed.). London: Pearson Publications Ltd.

  • Dickes, A. C., & Sengupta, P. (2013). Learning natural selection in 4th grade with multi-agent-based computational models. Research in Science Education, 43(3), 921–953.

  • Fahy, P. J. (2001). Addressing some common problems in transcript analysis. The International Review of Research in Open and Distributed Learning, 1(2). http://www.irrodl.org/index.php/irrodl/article/view/321/530. Accessed 5 Sep 2018.

  • Gelder, T. V. (2005). Teaching critical thinking: Some lessons from cognitive science. College Teaching, 53(1), 41–48.

  • Gibson, J. J. (1966). The senses considered as perceptual systems. Boston: Houghton Mifflin.

  • Glassner, A., Weinstock, M., & Neuman, Y. (2005). Pupils' evaluation and generation of evidence and explanation in argumentation. British Journal of Educational Psychology, 75(1), 105–118.

  • Google for Education (n.d.). https://edu.google.com/. Accessed 10 Aug 2018.

  • Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42(1), 38–43.

  • Grover, S., Pea, R., & Cooper, S. (2015). Designing for deeper learning in a blended computer science course for middle school students. Computer Science Education, 25(2), 199–237.

  • Han, B., Bae, Y., & Park, J. (2016). The effect of mathematics achievement variables on Scratch programming activities of elementary school students. International Journal of Software Engineering and Its Applications, 10(12), 21–30.

  • Heggart, K. R., & Yoo, J. (2018). Getting the most from Google Classroom: A pedagogical framework for tertiary educators. Australian Journal of Teacher Education, 43(3), 9.

  • Hsia, L. H., Huang, I., & Hwang, G. J. (2016). A web-based peer-assessment approach to improving junior high school students' performance, self-efficacy and motivation in performing arts courses. British Journal of Educational Technology, 47(4), 618–632.

  • Kafai, Y. B., & Burke, Q. (2013). Computer programming goes back to school. Phi Delta Kappan, 95(1), 61–65.

  • Klein, R., Orelup, R., & Smith, M. (2012). Google Apps for Education: Valparaiso University's migration experience. In Proceedings of the 40th annual ACM SIGUCCS conference on user services (pp. 203–208). ACM.

  • Koul, R., Fraser, B., & Nastiti, H. (2018). Transdisciplinary instruction: Implementing and evaluating a primary-school STEM teaching model. International Journal of Innovation in Science and Mathematics Education (formerly CAL-laborate International), 26(8), 17–29.

  • Lai, C. S., & Lai, M. H. (2012). Using computer programming to enhance science learning for 5th graders in Taipei. In 2012 International Symposium on Computer, Consumer and Control (IS3C) (pp. 146–148). IEEE.

  • Lee, E. Y., Chan, C. K., & van Aalst, J. (2006). Students assessing their own collaborative knowledge building. International Journal of Computer-Supported Collaborative Learning, 1(1), 57–87.

  • Lindh, M., Nolin, J., & Hedvall, K. N. (2016). Pupils in the clouds: Implementation of Google Apps for Education. First Monday, 21(4).

  • Liu, N. F., & Carless, D. (2006). Peer feedback: The learning element of peer assessment. Teaching in Higher Education, 11(3), 279–290.

  • Liu, E. Z. F., Lin, S. S., Chiu, C. H., & Yuan, S. M. (2001). Web-based peer review: The learner as both adapter and reviewer. IEEE Transactions on Education, 44(3), 246–251.

  • Lye, S. Y., & Koh, J. H. L. (2014). Review on teaching and learning of computational thinking through programming: What is next for K-12? Computers in Human Behavior, 41, 51–61.

  • McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–282.

  • Miles, M. B., & Huberman, A. M. (1994). Qualitative data analysis: An expanded sourcebook. Thousand Oaks: Sage.

  • Papert, S. (1980). Mindstorms: Children, computers, and powerful ideas. New York: Basic Books.

  • Pond, K., Ul-Haq, R., & Wade, W. (1995). Peer review: A precursor to peer assessment. Innovations in Education and Training International, 32(4), 314–323.

  • Price, E., Goldberg, F., Robinson, S., & McKean, M. (2016). Validity of peer grading using calibrated peer review in a guided-inquiry, conceptual physics course. Physical Review Physics Education Research, 12(2), 020145.

  • Quek, C. L., & Wang, Q. (2014). Exploring teachers’ perceptions of wikis for learning classroom cases. The Australian Journal of Teacher Education, 39(2), 100–120.

  • Resnick, M. (2013). Learn to code, code to learn. EdSurge. https://www.edsurge.com/news/2013-05-08-learn-to-code-code-to-learn. Accessed 30 Jul 2018.

  • Rick, J., & Guzdial, M. (2006). Situating CoWeb: A scholarship of application. International Journal of Computer-Supported Collaborative Learning, 1(1), 89–115.

  • Robertson, C. (2013). Using a cloud-based computing environment to support teacher training on common core implementation. TechTrends, 57(6), 57–60.

  • Roth, W. M. (1997). From everyday science to science education: How science and technology studies inspired curriculum design and classroom research. Science & Education, 6(4), 373–396.

  • Sáez-López, J. M., Román-González, M., & Vázquez-Cano, E. (2016). Visual programming languages integrated across the curriculum in elementary school: A two-year case study using “Scratch” in five schools. Computers & Education, 97, 129–141.

  • Sanjanaashree, P., & Soman, K. P. (2014). Language learning for visual and auditory learners using Scratch toolkit. In 2014 International Conference on Computer Communication and Informatics (ICCCI) (pp. 1–5). Coimbatore: IEEE.

  • Sentance, S., & Csizmadia, A. (2017). Computing in the curriculum: Challenges and strategies from a teacher’s perspective. Education and Information Technologies, 22(2), 469–495.

  • Sherin, B. L. (2001). A comparison of programming languages and algebraic notation as expressive languages for physics. International Journal of Computers for Mathematical Learning, 6(1), 1–61.

  • Shitut, N. (2018). 5 skills you need to know to become a big data analyst. https://analyticstraining.com/5-skills-need-know-become-big-data-analyst/. Accessed 30 Nov 2018.

  • Singh, A. K. J., Harun, R. N. S. R., & Fareed, W. (2013). Affordances of Wikispaces for collaborative learning and knowledge management. GEMA Online® Journal of Language Studies, 13(3), 79–97.

  • Stein, S., Ware, J., Laboy, J., & Schaffer, H. E. (2013). Improving K-12 pedagogy via a cloud designed for education. International Journal of Information Management, 33(1), 235–241.

  • Stemler, S. (2001). An overview of content analysis. Practical Assessment, Research & Evaluation, 7(17), 137–146.

  • Topping, K. (1998). Peer assessment between students in colleges and universities. Review of Educational Research, 68(3), 249–276.

  • Topping, K. J., & Ehly, S. W. (2001). Peer assisted learning: A framework for consultation. Journal of Educational and Psychological Consultation, 12(2), 113–132.

  • Tsai, Y. C., & Chuang, M. T. (2013). Fostering revision of argumentative writing through structured peer assessment. Perceptual and Motor Skills, 116(1), 210–221.

  • Tsivitanidou, O. E., Zacharia, Z. C., & Hovardas, T. (2011). Investigating secondary school students’ unmediated peer assessment skills. Learning and Instruction, 21(4), 506–519.

  • Utting, I., Cooper, S., Kölling, M., Maloney, J., & Resnick, M. (2010). Alice, Greenfoot, and Scratch: A discussion. ACM Transactions on Computing Education (TOCE), 10(4), 17.

  • Wagh, A. (2016). Building v/s exploring models: Comparing learning of evolutionary processes through agent-based modelling. Evanston: Northwestern University.

  • Wang, Y., & Jin, B. (2010). The application of SaaS model in network education: Take Google Apps for example. In 2010 2nd International Conference on Education Technology and Computer (ICETC) (Vol. 4, pp. V4-191). IEEE.

  • Wang, Q. Y., Woo, H. L., & Chai, C. S. (2010). Affordances of ICT tools for learning. In C. S. Chai & Q. Y. Wang (Eds.), ICT for self-directed and collaborative learning (pp. 70–79). Singapore: Pearson/Prentice Hall.

  • Wang, H. Y., Huang, I., & Hwang, G. J. (2016a). Comparison of the effects of project-based computer programming activities between mathematics-gifted students and average students. Journal of Computers in Education, 3(1), 33–45.

  • Wang, Y., Liang, Y., Liu, L., & Liu, Y. (2016b). A multi-peer assessment platform for programming language learning: Considering group non-consensus and personal radicalness. Interactive Learning Environments, 24(8), 2011–2031.

  • Wang, X. M., Hwang, G. J., Liang, Z. Y., & Wang, H. Y. (2017). Enhancing students’ computer programming performances, critical thinking awareness and attitudes towards programming: An online peer-assessment attempt. Journal of Educational Technology & Society, 20(4), 58–68.

  • Wilensky, U., & Reisman, K. (2006). Thinking like a wolf, a sheep, or a firefly: Learning biology through constructing and testing computational theories—An embodied modeling approach. Cognition and Instruction, 24(2), 171–209.

  • Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49(3), 33–35.

  • Xia, B. S. (2017). An in-depth analysis of learning goals in higher education: Evidence from the programming education. Journal of Learning Design, 10(2), 25–34.

  • Xiao, Y., & Lucking, R. (2008). The impact of two types of peer assessment on students' performance and satisfaction within a wiki environment. The Internet and Higher Education, 11(3–4), 186–193.

  • Yim, S., Warschauer, M., & Zheng, B. (2016). Google Docs in the classroom: A district-wide case study. Teachers College Record, 118(9).


Author information


Corresponding author

Correspondence to Roshni Sabarinath.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix 1

Review criteria for assessing source code

The maximum mark that can be awarded is 100.

In case of a compilation error, NO MARKS WILL BE GRANTED.

Marks will be deducted where the coding rules below are not followed.

Coding criteria and marks deducted:

1. Lack of clarity in comments

  1.1. ABSENCE of a comment at the top of every source code file stating the purpose of the program (2 marks)

  1.2. ABSENCE of comments indicating the libraries used (2 marks)

  1.3. ABSENCE of a comment before every block explaining the purpose of the code block (2 marks)

  1.4. ABSENCE of line comments wherever complex logic has been applied (2 marks)

2. Improper or poor layout

  2.1. Proper indentation of code is NOT done (3 marks)

  2.2. Nested blocks indented using tabs instead of spaces, resulting in incompatibility across editors (3 marks)

  2.3. Multiple code statements in a single line (3 marks)

3. Inappropriate identifier naming

  3.1. Constants NOT in uppercase with words NOT separated by underscores (4 marks)

  3.2. Variables named a, b, ii, jj, etc. (4 marks)

  3.3. Function names NOT descriptive of the purpose served by the function; descriptive names resemble cheapest_product() or highestPaidEmployee() (4 marks)

4. Program does not meet the criteria of the problem posed

  4.1. The program logic differs from what is asked in the question (5 to 15 marks)

  4.2. The program is not efficient, e.g. unnecessary temporary variables, unnecessary looping, etc. (5 to 10 marks)

  4.3. The interface is not user-friendly and does not give appropriate prompts to the user (5 to 10 marks)

  4.4. Error handling has not been taken care of (5 to 15 marks)
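To make the rubric concrete, the short program below illustrates code that would score well against these criteria. It is a minimal sketch, not taken from the study: the article does not specify which programming language the students used, so C++ is assumed here, and the task (averaging student grades) is invented for illustration.

/*
 * Purpose: reads grades from the user and reports their average.
 * (Hypothetical task, written only to illustrate the review criteria.)
 */

// Libraries used: iostream for console I/O, vector for storing grades.
#include <iostream>
#include <vector>

// Constant in uppercase with underscore-separated words (criterion 3.1).
const int MAX_GRADE = 100;

// Descriptive function name (criterion 3.3): returns the average grade.
double average_grade(const std::vector<int>& grades) {
    // Error handling (criterion 4.4): avoid dividing by zero.
    if (grades.empty()) {
        return 0.0;
    }
    int sum = 0;
    for (int grade : grades) {
        sum += grade;
    }
    return static_cast<double>(sum) / grades.size();
}

int main() {
    std::vector<int> grades;
    int input = 0;

    // Friendly, appropriate prompt for the user (criterion 4.3).
    std::cout << "Enter grades from 0 to " << MAX_GRADE
              << ", or a negative number to stop: ";

    // One statement per line, indented with spaces (criteria 2.1 to 2.3).
    while (std::cin >> input && input >= 0) {
        if (input <= MAX_GRADE) {
            grades.push_back(input);
        } else {
            // Error handling (criterion 4.4): reject out-of-range grades.
            std::cout << "Ignoring " << input << ": above " << MAX_GRADE << "\n";
        }
    }

    std::cout << "Average grade: " << average_grade(grades) << "\n";
    return 0;
}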

Appendix 2

Sample peer review comments

Figure a: sample peer review comments (image not reproduced in this text version).
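Since the sample comments survive only as an image, the following is a purely hypothetical illustration (not taken from the study's figure) of the kind of comment a reviewer might leave when applying the Appendix 1 criteria to a peer's code:

// Submitted snippet (hypothetical):
int f(int a, int b) {
    if (a > b) return a; else return b;
}

// Reviewer comments (hypothetical):
// 1. The function name f and parameters a, b are not descriptive
//    (criterion 3.3); consider larger_of(first_value, second_value).
// 2. The if and else are packed into one line as multiple statements
//    (criterion 2.3); split them onto separate lines.
// 3. There is no comment above the function stating its purpose
//    (criteria 1.1 and 1.3).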

Appendix 3

A survey on students’ perceptions of the online learning environment

Dear Participant,

The purpose of this survey is to capture students’ perceptions of the online learning environment used in this class. The survey will take about 15 minutes.

Your honest responses will provide valuable input for improving the quality of teaching and learning in this class. All responses will be kept confidential, and no names will be identified in the report of findings.

Thank you.

To what extent do you agree with the following statements?

(5-point Likert scale: 1 = Strongly Disagree to 5 = Strongly Agree)

Pedagogical affordance refers to those technology features that improve learning environments and govern how learning activities can be implemented in an educational setting (adapted from Quek and Wang 2014).

In this class, …

1. I use Google Drive’s resources for my learning of programming.

2. I use Google Forms for my formative assessment of programming.

3. I use Google Docs for peer review of codes.

4. I use Google Drive for file sharing.

5. I find Google Apps useful for my learning of programming in this class.

Social affordance refers to the characteristics of the environment that support social interaction among its users (adapted from Quek and Wang 2014).

In this class, …

6. I exchange programming ideas with my peers using Google Docs.

7. I read my peers’ proposed alternative programming solutions in Google Docs.

8. I respond to peers’ comments made in Google Docs.

9. I think through my peers’ comments shared on Google Docs.

10. I find my peers’ comments in Google Docs useful in improving my programming.

Technical affordance refers to the usability of the environment for learning and task execution (adapted from Quek and Wang 2014).

In this class, …

11. I find this online environment (i.e. designed using Google Apps) easy to use.

12. I find the revision history feature of Google Apps useful for tracking all changes made to any file or folder.

13. I find the comment feature of Google Docs useful for posting comments on peers’ codes.

14. I find the file sharing feature (e.g. uploading or downloading files) useful for my learning.

15. I find that there is no technical difficulty in using Google Apps in this programming class.

Open-ended questions

16. What would you consider the most important benefit(s) of using Google Apps in computer science classes?

17. What improvement(s) would you suggest for the use of Google Apps in computer science classes?

18. How did you feel knowing that others in the class could access your codes?


About this article


Cite this article

Sabarinath, R., Quek, C.L.G. A case study investigating programming students’ peer review of codes and their perceptions of the online learning environment. Educ Inf Technol 25, 3553–3575 (2020). https://doi.org/10.1007/s10639-020-10111-9

