Abstract
With the ongoing development of online education platforms, knowledge tracing (KT) has become a critical task that helps these platforms provide personalized education. KT aims to infer students’ knowledge states and predict whether a student will answer the next question correctly based on their exercise history. However, existing works fail to incorporate question information and ignore useful contextual information. In this paper, we propose a novel Sequential Self-Attentive model for Knowledge Tracing (SSAKT). SSAKT utilizes question information based on Multidimensional Item Response Theory (MIRT), which captures the relations between questions and skills. SSAKT then uses a self-attention layer to capture the relations between questions. Unlike traditional self-attention networks, the self-attention layer in SSAKT performs positional encoding with a Long Short-Term Memory (LSTM) network. Moreover, a context module is designed to capture contextual information. Experiments on four real-world datasets show that SSAKT outperforms existing KT models. We also conduct a case study showing that our model effectively captures the relations between questions and skills.
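The MIRT framework the abstract refers to models a response as a function of a multidimensional ability vector and per-question parameters that encode how strongly the question loads on each skill. The following is a minimal sketch of the standard compensatory MIRT (multidimensional 2PL) probability, not the paper's full embedding pipeline; all variable names are illustrative.

```python
import math

def mirt_prob(theta, a, d):
    """Compensatory MIRT probability that a student answers correctly.

    theta : latent ability values, one per skill dimension
    a     : discrimination parameters (the question-skill relations)
    d     : scalar intercept (easiness) term
    """
    logit = sum(ai * ti for ai, ti in zip(a, theta)) + d
    return 1.0 / (1.0 + math.exp(-logit))

# A student strong in skill 0 but weak in skill 1, answering a
# question that loads mostly on skill 0:
p = mirt_prob(theta=[1.2, -0.5], a=[1.0, 0.2], d=0.0)
```

Because the discrimination vector `a` ties each question to the skill dimensions it exercises, fitting such a model yields question representations that reflect question-skill relations, which is the property SSAKT exploits when building its question embeddings.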
Notes
- 1. Source code will be available at https://github.com/zxlzxlzxlzxlzxl/SSAKT.
Acknowledgements
This work is partially supported by the National Natural Science Foundation of China (Nos. U1811263 and 62072349) and the National Key Research and Development Project of China (No. 2020YFC1522602).
Copyright information
© 2021 Springer Nature Switzerland AG
About this paper
Cite this paper
Zhang, X., Zhang, J., Lin, N., Yang, X. (2021). Sequential Self-Attentive Model for Knowledge Tracing. In: Farkaš, I., Masulli, P., Otte, S., Wermter, S. (eds) Artificial Neural Networks and Machine Learning – ICANN 2021. Lecture Notes in Computer Science, vol 12891. Springer, Cham. https://doi.org/10.1007/978-3-030-86362-3_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-86361-6
Online ISBN: 978-3-030-86362-3
eBook Packages: Computer Science, Computer Science (R0)