DOI: 10.1145/3456565.3460053

Assessing the Cold Start Problem in Adaptive Systems

Published: 26 June 2021

Abstract

This study compares methods for estimating task difficulty with respect to the so-called "cold start" problem, i.e., during the initial phase of introducing an adaptive educational system to the public. The data originate from an item-based online programming course offered on the RunCode online learning platform. The dataset contains 50,055 submissions to 76 tasks by 299 learners. Reference task difficulty values were estimated with the Item Response Theory Graded Response Model on the full dataset. The results show that for the smallest sample size, n = 5 learners, the Elo rating algorithm achieves a reasonably high correlation of 0.702. For small sample sizes of n = 5, 10, and 20 learners, the Elo algorithm delivers slightly better estimates than the Glicko rating algorithm and outperforms the proportion correct (PC) method. For a sample size of n = 50, the Glicko algorithm delivers better results than Elo, and the difference between the rating algorithms and the proportion correct method decreases.
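The Elo-based difficulty estimation compared in the abstract can be sketched as follows. This is the standard adaptation of Elo to learner-task interactions (each submission is treated as a "match" between a learner's ability rating and a task's difficulty rating); the step size k = 0.4 and the function name are illustrative assumptions, not values or code from the paper.

```python
import math

def elo_update(theta, d, correct, k=0.4):
    """One Elo-style update after a learner's submission on a task.

    theta:   learner ability rating
    d:       task difficulty rating
    correct: 1 if the submission was correct, 0 otherwise
    k:       step size (illustrative value, not from the paper)
    """
    # Expected probability of success from the rating difference
    # (logistic model, as in Elo-for-education adaptations).
    p = 1.0 / (1.0 + math.exp(d - theta))
    # A correct answer raises the learner's rating and lowers the
    # task's difficulty; an incorrect one does the opposite.
    theta_new = theta + k * (correct - p)
    d_new = d - k * (correct - p)
    return theta_new, d_new
```

Starting from equal ratings, a correct submission (p = 0.5) moves the learner's rating up and the task's difficulty down by the same amount, which is why the method can produce usable difficulty estimates after only a handful of learners.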


Cited By

  • (2024) Synthetic Students: A Comparative Study of Bug Distribution Between Large Language Models and Computing Students. Proceedings of the 2024 ACM Virtual Global Computing Education Conference V. 1, 137-143. DOI: 10.1145/3649165.3690100. Online publication date: 5 Dec 2024.
  • (2024) Design of Assessment Task Analytics Dashboard Based on Elo Rating in E-Assessment. Assessment Analytics in Education, 173-188. DOI: 10.1007/978-3-031-56365-2_9. Online publication date: 8 May 2024.
  • (2023) Warming up the Cold Start: Adaptive Step Size Method for the Urnings Algorithm. Artificial Intelligence in Education: Posters and Late Breaking Results, Workshops and Tutorials, Industry and Innovation Tracks, Practitioners, Doctoral Consortium and Blue Sky, 409-414. DOI: 10.1007/978-3-031-36336-8_64. Online publication date: 30 Jun 2023.



Published In

ITiCSE '21: Proceedings of the 26th ACM Conference on Innovation and Technology in Computer Science Education V. 2
June 2021
109 pages
ISBN:9781450383974
DOI:10.1145/3456565
Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the Owner/Author.

Publisher

Association for Computing Machinery

New York, NY, United States


Author Tags

  1. cold start problem
  2. online learning environment
  3. task difficulty

Qualifiers

  • Poster

Conference

ITiCSE 2021

Acceptance Rates

Overall Acceptance Rate 552 of 1,613 submissions, 34%



Article Metrics

  • Downloads (last 12 months): 13
  • Downloads (last 6 weeks): 1

Reflects downloads up to 2 March 2025
