MultiHop attention for knowledge diagnosis of mathematics examination

Published in Applied Intelligence

Abstract

Intelligent educational diagnosis can effectively promote the application of artificial intelligence in education. Knowledge diagnosis in specific domains (e.g., mathematics, physics) plays an important role in intelligent educational diagnosis but typically relies on complex semantic information. Most existing methods produce only a single sentence representation, which makes it difficult to detect multiple knowledge points in a text, and annotated resources for domain-specific knowledge-point diagnosis remain sparse. In this study, we build a mathematics dataset collected from real mathematics examinations and manually annotated with 18 knowledge points. We also propose the MultiHop Attention (MHA) model, which applies multiple attention mechanisms to focus on different important information in a mathematical question; each attention hop assigns its own attention weights to different parts of the question, allowing MHA to obtain a comprehensive semantic representation. Because the ALBERT model is both effective and efficient, we use it for word embedding. The proposed method jointly considers multiple keywords related to knowledge points in mathematical questions for knowledge diagnosis. Experimental results on the proposed mathematical dataset show that MHA achieves marked improvements over existing methods.
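As a rough illustration only (not the authors' implementation, whose details are not given in the abstract), a multi-hop attention layer can be sketched as several independent attention "hops" over the token embeddings of a question, each with its own learned query vector; concatenating the per-hop context vectors yields a representation that can attend to several knowledge-point keywords at once. All names and dimensions below are hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multihop_attention(H, hop_queries):
    """H: (seq_len, d) token embeddings (e.g., from ALBERT).
    hop_queries: list of (d,) learned query vectors, one per hop.
    Returns the concatenation of per-hop context vectors, shape (hops*d,)."""
    contexts = []
    for q in hop_queries:
        scores = H @ q              # (seq_len,) unnormalized attention scores
        alpha = softmax(scores)     # attention weights: one distribution per hop
        contexts.append(alpha @ H)  # (d,) weighted sum of token embeddings
    return np.concatenate(contexts)

rng = np.random.default_rng(0)
H = rng.normal(size=(12, 8))                 # 12 tokens, 8-dim embeddings
hops = [rng.normal(size=8) for _ in range(3)]  # 3 attention hops
rep = multihop_attention(H, hops)
print(rep.shape)                             # (24,) — 3 hops * 8 dims
```

Because each hop learns a different query, the hops can place high weight on different regions of the question, which is what lets the pooled representation cover multiple knowledge points rather than averaging them away.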



Author information


Corresponding author

Correspondence to Tongxuan Zhang.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Xinyu He and Guiyun Zhang contributed equally to this work.

Rights and permissions

Springer Nature or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.


About this article


Cite this article

He, X., Zhang, T. & Zhang, G. MultiHop attention for knowledge diagnosis of mathematics examination. Appl Intell 53, 10636–10646 (2023). https://doi.org/10.1007/s10489-022-04033-x

