Abstract
Recently, working memory models have been incorporated into knowledge tracing to improve prediction performance. Working memory is a cognitive system with limited processing capacity that enables short-term storage of information. However, existing approaches neither model this limited capacity adequately nor account for individual differences in processing capacity. To resolve these problems, we model working memory for knowledge tracing (MCKT) using a Convolutional Neural Network (CNN). To this end, multiple CNNs with convolution kernels of different scales process a single interaction sequence, where each kernel scale represents a particular processing capacity of the current learner. Through this multi-scale convolution kernel approach, MCKT achieves a personalized simulation of the processing capacities of different learners. Moreover, long-term memory is simulated to predict future learner responses. Experimental results on several real-world benchmark datasets show that MCKT outperforms multiple classical models.
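The multi-scale idea described above can be sketched as follows: several causal 1-D convolutions with different kernel widths slide over the same embedded interaction sequence, and their outputs are concatenated, so each width acts as a different window of recent interactions a learner can hold in working memory. This is a minimal NumPy illustration under assumed shapes and random weights, not the authors' exact architecture; the function name `multi_scale_conv` and all parameters are hypothetical.

```python
import numpy as np

def multi_scale_conv(seq, kernel_sizes, rng=None):
    """Apply causal 1-D convolutions of several kernel widths to one
    interaction sequence and concatenate the resulting features.

    seq: (T, d) array of interaction embeddings for one learner.
    kernel_sizes: list of window widths; each width stands for a
        different hypothesized working-memory processing capacity.
    Returns: (T, d * len(kernel_sizes)) feature matrix.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    T, d = seq.shape
    outputs = []
    for k in kernel_sizes:
        # random kernel weights, one (d, d) map per position in the window
        w = rng.standard_normal((k, d, d)) * 0.1
        # left-pad so position t only sees interactions up to t (causal)
        padded = np.vstack([np.zeros((k - 1, d)), seq])
        feat = np.stack([
            sum(padded[t + j] @ w[j] for j in range(k))
            for t in range(T)
        ])
        outputs.append(np.maximum(feat, 0.0))  # ReLU activation
    return np.concatenate(outputs, axis=1)
```

Concatenating the per-scale features lets a downstream predictor weigh short and long capacity windows per learner, which is one plausible way to realize the personalized capacity simulation the abstract describes.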
Acknowledgements
This research was supported by the National Natural Science Foundation of China (Grant No. 61977033), the State Key Program of National Natural Science of China (Grant No. U20A20229), and the Central China Normal University National Teacher Development Collaborative Innovation Experimental Base Construction Research Project (Grant No. CCNUTEIII 2021-03).
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Yang, H., Chen, B., Hu, J., Huang, T., Geng, J., Tang, L. (2023). Modeling Working Memory Using Convolutional Neural Networks for Knowledge Tracing. In: Huang, DS., Premaratne, P., Jin, B., Qu, B., Jo, KH., Hussain, A. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2023. Lecture Notes in Computer Science, vol 14087. Springer, Singapore. https://doi.org/10.1007/978-981-99-4742-3_11
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-4741-6
Online ISBN: 978-981-99-4742-3