DOI: 10.1145/3446871.3469781
Crowdsourcing in Computer Science Education

Published: 17 August 2021

ABSTRACT

Crowdsourcing is a method of collecting services, ideas, materials, or other artefacts from a relatively large and open group of people. In computer science education, crowdsourcing has been used to alleviate teachers’ workload in creating course content, and as a learning and revision method for students through its use in educational systems. Tools that utilize crowdsourcing give students an opportunity to further familiarize themselves with course concepts while creating new content for their peers and for future course iterations. In my research, I focus on investigating the effects of computing education systems that use crowdsourcing on students’ learning, and the types of quality assurance methods required to use the artefacts students produce with these tools.

Published in

ICER 2021: Proceedings of the 17th ACM Conference on International Computing Education Research
August 2021
451 pages
ISBN: 978-1-4503-8326-4
DOI: 10.1145/3446871

      Copyright © 2021 Owner/Author

      Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

      Publisher

      Association for Computing Machinery

      New York, NY, United States

Acceptance Rates

Overall Acceptance Rate: 189 of 803 submissions, 24%

