DOI: 10.1145/2157136.2157250

The impact of question generation activities on performance

Published: 29 February 2012

Abstract

Recent interest in student-centric pedagogies has resulted in the development of numerous tools that support student-generated questions. Previous evaluations of such tools have reported strong correlations between student participation and exam performance, yet the level of student engagement with other learning activities in the course is a potential confounding factor. We show that such correlations may be explained by other factors, and we undertake a deeper analysis that reveals evidence of the positive impact question-generation activities have on student performance.
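The confounding concern the abstract raises can be made concrete with a small statistical sketch: if a latent "engagement" trait drives both question-authoring activity and exam scores, the raw correlation between the two looks strong, but a partial correlation that controls for engagement collapses toward zero. The simulation below is a generic illustration under that assumption; the variable names and the partial-correlation technique are mine, not the analysis the paper performs.

```python
import math
import random

def pearson(x, y):
    """Sample Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def partial_corr(x, y, z):
    """Correlation of x and y after removing the linear effect of z."""
    rxy, rxz, ryz = pearson(x, y), pearson(x, z), pearson(y, z)
    return (rxy - rxz * ryz) / math.sqrt((1 - rxz ** 2) * (1 - ryz ** 2))

random.seed(1)
# Hypothetical data: a latent engagement trait drives both the number of
# questions a student authors and that student's exam score.
engagement = [random.gauss(0, 1) for _ in range(500)]
questions = [e + random.gauss(0, 0.5) for e in engagement]
exam = [e + random.gauss(0, 0.5) for e in engagement]

raw = pearson(questions, exam)                         # appears strong
controlled = partial_corr(questions, exam, engagement) # shrinks toward zero
print(f"raw r = {raw:.2f}, partial r = {controlled:.2f}")
```

Here the apparent questions-to-exam relationship is entirely an artifact of the shared engagement driver, which is exactly why a correlation between participation and performance alone cannot establish impact.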



Published In

SIGCSE '12: Proceedings of the 43rd ACM Technical Symposium on Computer Science Education
February 2012, 734 pages
ISBN: 9781450310987
DOI: 10.1145/2157136

Publisher

Association for Computing Machinery, New York, NY, United States


Author Tags

  1. constructive evaluation
  2. contributing student pedagogy
  3. free-response
  4. question generation
  5. question posing
  6. student generated
  7. studysieve
  8. user generated content

Qualifiers

  • Research-article

Conference

SIGCSE '12: The 43rd ACM Technical Symposium on Computer Science Education
February 29 - March 3, 2012
Raleigh, North Carolina, USA

Acceptance Rates

SIGCSE '12 Paper Acceptance Rate: 100 of 289 submissions, 35%
Overall Acceptance Rate: 1,787 of 5,146 submissions, 35%

Article Metrics

  • Downloads (last 12 months): 19
  • Downloads (last 6 weeks): 0
Reflects downloads up to 03 Mar 2025

Cited By
  • (2023) Computing Education Research in Australasia. In: Past, Present and Future of Computing Education Research, pp. 373-394. DOI: 10.1007/978-3-031-25336-2_17. Online publication date: 5-Jan-2023
  • (2022) Analytics of learning tactics and strategies in an online learnersourcing environment. Journal of Computer Assisted Learning, 39(1), pp. 94-112. DOI: 10.1111/jcal.12729. Online publication date: 31-Aug-2022
  • (2021) What's In It for the Learners? Evidence from a Randomized Field Experiment on Learnersourcing Questions in a MOOC. Proceedings of the Eighth ACM Conference on Learning @ Scale, pp. 221-233. DOI: 10.1145/3430895.3460142. Online publication date: 8-Jun-2021
  • (2021) Rethinking the Carrot and the Stick: A Case Study of Non-Grade-Bearing Learning Activities to Enhance Students' Engagement and Achievement. New Zealand Journal of Educational Studies. DOI: 10.1007/s40841-021-00197-1. Online publication date: 16-Apr-2021
  • (2021) Effects of Question Type Presentation on Raised Questions in a Video Learning Framework. Diversity, Divergence, Dialogue, pp. 114-126. DOI: 10.1007/978-3-030-71305-8_9. Online publication date: 19-Mar-2021
  • (2020) QMaps: Engaging Students in Voluntary Question Generation and Linking. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1-14. DOI: 10.1145/3313831.3376882. Online publication date: 21-Apr-2020
  • (2019) Does Creating Programming Assignments with Tests Lead to Improved Performance in Writing Unit Tests? Proceedings of the ACM Conference on Global Computing Education, pp. 106-112. DOI: 10.1145/3300115.3309516. Online publication date: 9-May-2019
  • (2018) A Comparison of the Effectiveness of Two Computer-Based Learning Aids. Frontiers in Education, 3. DOI: 10.3389/feduc.2018.00051. Online publication date: 9-Jul-2018
  • (2017) Examining a Student-Generated Question Activity Using Random Topic Assignment. Proceedings of the 2017 ACM Conference on Innovation and Technology in Computer Science Education, pp. 146-151. DOI: 10.1145/3059009.3059033. Online publication date: 28-Jun-2017
  • (2016) With a Little Help From My Friends. Proceedings of the 2016 ACM Conference on International Computing Education Research, pp. 201-209. DOI: 10.1145/2960310.2960322. Online publication date: 25-Aug-2016
