Modeling Working Memory Using Convolutional Neural Networks for Knowledge Tracing

  • Conference paper
  • First Online:
Advanced Intelligent Computing Technology and Applications (ICIC 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14087)

Abstract

Working memory is a cognitive system with limited processing capacity that enables short-term storage of information, and it has recently been used in knowledge tracing to improve prediction performance. However, existing approaches neither model this limited capacity adequately nor account for individual differences in processing capacity. To address these problems, we propose MCKT, which models working memory for knowledge tracing using convolutional neural networks (CNNs). Multiple CNNs with convolution kernels of different scales process a single interaction sequence, where each kernel scale represents a particular processing capacity of the current learner; this multi-scale design allows MCKT to personalize the simulated processing capacity of each learner. In addition, long-term memory is modeled to predict learners' future responses. Experiments on several real-world benchmark datasets show that MCKT outperforms multiple classical models.
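
The abstract's core mechanism, several convolution kernels of different scales applied to one interaction sequence with the scale standing in for a learner's processing capacity, can be illustrated with a short sketch. The code below is an assumption-laden illustration in PyTorch, not the authors' implementation: the module name, embedding size, kernel sizes, and the gated fusion across scales are all hypothetical.

    # Illustrative sketch (not the paper's code): multi-scale 1-D convolutions
    # over an embedded interaction sequence, fused by a learned per-step gate.
    import torch
    import torch.nn as nn

    class MultiScaleConvBlock(nn.Module):
        """Applies several 1-D convolutions with different kernel sizes to the
        same interaction sequence; each kernel size stands in for one
        hypothetical working-memory capacity."""

        def __init__(self, emb_dim: int = 64, kernel_sizes=(3, 5, 7)):
            super().__init__()
            # One convolution branch per capacity scale; odd kernels with
            # padding k // 2 preserve the sequence length.
            self.branches = nn.ModuleList(
                nn.Conv1d(emb_dim, emb_dim, k, padding=k // 2) for k in kernel_sizes
            )
            # Learned gate that mixes the branches per time step, so different
            # learners can lean on different capacities (an assumption).
            self.gate = nn.Linear(emb_dim * len(kernel_sizes), len(kernel_sizes))

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, seq_len, emb_dim) embedded question-response interactions
            h = x.transpose(1, 2)                               # (batch, emb_dim, seq_len)
            outs = [torch.relu(conv(h)) for conv in self.branches]
            stacked = torch.stack(outs, dim=-1)                 # (batch, emb_dim, seq_len, n_scales)
            concat = torch.cat(outs, dim=1).transpose(1, 2)     # (batch, seq_len, emb_dim * n_scales)
            weights = torch.softmax(self.gate(concat), dim=-1)  # (batch, seq_len, n_scales)
            fused = (stacked * weights.unsqueeze(1)).sum(dim=-1)
            return fused.transpose(1, 2)                        # (batch, seq_len, emb_dim)

    if __name__ == "__main__":
        block = MultiScaleConvBlock()
        seq = torch.randn(8, 50, 64)   # 8 learners, 50 interactions, 64-d embeddings
        print(block(seq).shape)        # torch.Size([8, 50, 64])

Feeding the same sequence through every branch and letting a learned gate weight the branches per time step is one plausible reading of "personalized simulation of processing capacities"; the paper may fuse the scales differently, and a prediction head over the fused states would still be needed.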


Notes

  1. https://sites.google.com/site/assistmentsdata/home and https://sites.google.com/view/assistmentsdatamining/.

  2. https://pslcdatashop.web.cmu.edu/DatasetInfo?datasetId=507.

Acknowledgements

This research was supported by the National Natural Science Foundation of China (Grant No. 61977033), the State Key Program of National Natural Science of China (Grant No. U20A20229), and the Central China Normal University National Teacher Development Collaborative Innovation Experimental Base Construction Research Project (Grant No. CCNUTEIII 2021-03).

Author information

Corresponding author

Correspondence to Tao Huang.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Yang, H., Chen, B., Hu, J., Huang, T., Geng, J., Tang, L. (2023). Modeling Working Memory Using Convolutional Neural Networks for Knowledge Tracing. In: Huang, DS., Premaratne, P., Jin, B., Qu, B., Jo, KH., Hussain, A. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2023. Lecture Notes in Computer Science, vol 14087. Springer, Singapore. https://doi.org/10.1007/978-981-99-4742-3_11

  • DOI: https://doi.org/10.1007/978-981-99-4742-3_11

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-4741-6

  • Online ISBN: 978-981-99-4742-3

  • eBook Packages: Computer Science, Computer Science (R0)
