
Abstract

While many learning and assessment models focus on the binary correctness of student responses, previous studies have shown that having access to extra information—such as the time it takes students to respond to a question—can improve the performance of these models. As much of the previous work in this area has focused on knowledge tracing and next answer correctness, in this study we take a different approach and analyze the relationship between these extra types of information and the overall knowledge of the student, as measured by the end result of an adaptive assessment. In addition to looking at student response times, we investigate the benefit of having detailed information on the responses in the form of answer feedback tags from the adaptive assessment system. After using feature embeddings to encode the information from these feedback tags, we build several models and perform a feature importance analysis to compare the relative significance of these different variables. Although it appears that the response time variable does contain useful information, the answer feedback tags are ultimately much more important to the models.
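The paper itself does not include code, but the following is a minimal, hypothetical sketch of the kind of pipeline the abstract describes: answer feedback tags encoded with a learned feature embedding (here via PyTorch [10]), combined with a response-time feature, and passed to a random forest [2] whose impurity-based importances allow the relative significance of the variables to be compared. All shapes, names, and hyperparameters below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: embed feedback tags, append response time, and
# compare feature importances with a random forest [2]. Toy data only.
import numpy as np
import torch
import torch.nn as nn
from sklearn.ensemble import RandomForestRegressor

NUM_TAGS = 50   # assumed size of the feedback-tag vocabulary
EMBED_DIM = 8   # assumed embedding dimension

# Learned embedding for feedback tags (would be trained end-to-end in practice).
tag_embedding = nn.Embedding(NUM_TAGS, EMBED_DIM)

def encode_student(tag_ids: list[int], mean_response_time: float) -> np.ndarray:
    """Average the tag embeddings over a student's responses and append
    the response-time feature, yielding one fixed-length feature vector."""
    with torch.no_grad():
        tag_vec = tag_embedding(torch.tensor(tag_ids)).mean(dim=0).numpy()
    return np.concatenate([tag_vec, [mean_response_time]])

# Toy data: 200 students, each with 10 tagged responses and a final
# assessment score standing in for the "overall knowledge" target.
rng = np.random.default_rng(0)
X = np.stack([
    encode_student(rng.integers(0, NUM_TAGS, size=10).tolist(),
                   float(rng.exponential(30.0)))
    for _ in range(200)
])
y = rng.random(200)  # placeholder assessment outcomes

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# Compare the relative importance of the tag-embedding dimensions vs. time.
names = [f"tag_dim_{i}" for i in range(EMBED_DIM)] + ["response_time"]
for name, imp in sorted(zip(names, model.feature_importances_),
                        key=lambda p: -p[1]):
    print(f"{name}: {imp:.3f}")
```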


Notes

  1. To account for students with multiple assessments, the confidence intervals are computed using the cluster bootstrap method [3].
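As an illustration, the following is a minimal sketch of a cluster bootstrap of this kind: students are resampled with replacement as whole clusters, so that a student's multiple assessments stay together in each bootstrap replicate. The data layout and the statistic (a mean score) are assumptions for illustration, not the paper's actual data or code.

```python
# Minimal cluster bootstrap [3] sketch: resample students (clusters) with
# replacement rather than individual assessments. Toy data only.
import numpy as np

def cluster_bootstrap_ci(scores_by_student: dict, n_boot: int = 10_000,
                         alpha: float = 0.05, seed: int = 0):
    """Percentile CI for the mean score, resampling students as clusters."""
    rng = np.random.default_rng(seed)
    students = list(scores_by_student)
    stats = np.empty(n_boot)
    for b in range(n_boot):
        # Draw students with replacement; each draw brings along all of
        # that student's assessment scores.
        sample = rng.choice(students, size=len(students), replace=True)
        pooled = np.concatenate([scores_by_student[s] for s in sample])
        stats[b] = pooled.mean()
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    return lo, hi

# Example: three students, two of whom have multiple assessments.
data = {
    "s1": np.array([0.72, 0.80]),
    "s2": np.array([0.65]),
    "s3": np.array([0.90, 0.88, 0.85]),
}
print(cluster_bootstrap_ci(data, n_boot=2000))
```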

References

  1. Beck, J.E.: Engagement tracing: using response times to model student disengagement. In: Artificial Intelligence in Education (2005)
  2. Breiman, L.: Random forests. Mach. Learn. 45(1), 5–32 (2001)
  3. Field, C.A., Welsh, A.H.: Bootstrapping clustered data. J. R. Stat. Soc. Ser. B (Stat. Methodol.) 69(3), 369–390 (2007)
  4. González-Espada, W.J., Bullock, D.W.: Innovative applications of classroom response systems: investigating students' item response times in relation to final course grade, gender, general point average, and high school ACT scores. Electron. J. Integr. Technol. Educ. 6, 97–108 (2007)
  5. Inwegen, E.V., Adjei, S.A., Wang, Y., Heffernan, N.T.: Using partial credit and response history to model user knowledge. In: Educational Data Mining (2015)
  6. Jurafsky, D., Martin, J.H.: Speech and Language Processing (3rd ed. draft) (2021). https://web.stanford.edu/~jurafsky/slp3/
  7. Lin, C., Shen, S., Chi, M.: Incorporating student response time and tutor instructional interventions into student modeling. In: User Modeling, Adaptation and Personalization (2016)
  8. Liu, N., Wang, Z., Baraniuk, R.G., Lan, A.: Open-ended knowledge tracing (2022). https://doi.org/10.48550/ARXIV.2203.03716. https://arxiv.org/abs/2203.03716
  9. Matayoshi, J., Uzun, H., Cosyn, E.: Using a randomized experiment to compare the performance of two adaptive assessment engines. In: Educational Data Mining, pp. 821–827 (2022)
  10. Paszke, A., et al.: PyTorch: an imperative style, high-performance deep learning library. CoRR abs/1912.01703 (2019). http://arxiv.org/abs/1912.01703
  11. Pelánek, R.: Bayesian knowledge tracing, logistic models, and beyond: an overview of learner modeling techniques. User Model. User Adap. Inter. 27(3), 313–350 (2017)
  12. Pelánek, R.: Exploring the utility of response times and wrong answers for adaptive learning. In: Learning @ Scale, pp. 1–4 (2018)
  13. Pelánek, R., Effenberger, T.: Beyond binary correctness: classification of students' answers in learning systems. User Model. User Adap. Inter. 30(5), 867–893 (2020)
  14. Pelánek, R., Rihák, J.: Properties and applications of wrong answers in online educational systems. In: Educational Data Mining (2016)
  15. Piech, C., Huang, J., Nguyen, A., Phulsuksombati, M., Sahami, M., Guibas, L.: Learning program embeddings to propagate feedback on student code. In: International Conference on Machine Learning, pp. 1093–1102. PMLR (2015)
  16. Wang, Y., Heffernan, N.T.: Leveraging first response time into the knowledge tracing model. In: Educational Data Mining (2012)
  17. Wang, Y., Heffernan, N.T.: Extending knowledge tracing to allow partial credit: using continuous versus binary nodes. In: Artificial Intelligence in Education (2013)
  18. Wang, Y., Heffernan, N.T., Beck, J.E.: Representing student performance with partial credit. In: Educational Data Mining (2010)
  19. Wang, Y., Heffernan, N.T., Heffernan, C.: Towards better affect detectors: effect of missing skills, class features and common wrong answers. In: Learning Analytics and Knowledge (2015)


Author information

Correspondence to Jeffrey Matayoshi.

Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Matayoshi, J., Uzun, H., Cosyn, E. (2023). Analyzing Response Times and Answer Feedback Tags in an Adaptive Assessment. In: Wang, N., Rebolledo-Mendez, G., Dimitrova, V., Matsuda, N., Santos, O.C. (eds) Artificial Intelligence in Education. Posters and Late Breaking Results, Workshops and Tutorials, Industry and Innovation Tracks, Practitioners, Doctoral Consortium and Blue Sky. AIED 2023. Communications in Computer and Information Science, vol 1831. Springer, Cham. https://doi.org/10.1007/978-3-031-36336-8_46

  • DOI: https://doi.org/10.1007/978-3-031-36336-8_46

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-36335-1

  • Online ISBN: 978-3-031-36336-8

  • eBook Packages: Computer Science, Computer Science (R0)
