Abstract
Machine reading comprehension aims to improve how well machines read and understand human language. This branch of natural language processing is used to test and improve a machine's ability to read input texts and respond appropriately to queries. In this article, a persistent comprehensive model based on query reconstruction is introduced to address the "cloze-style" problem in text reading, in which a single input query can yield multiple, often irrelevant, outputs. To reduce this problem, the model reconstructs queries from the possible keyword combinations: candidate query keywords are replaced using the maximum set of individual combinations, and the combinations are swapped through deep-learning-based keyword training and substitution. This process is repeated until the best text output (answer/response) is obtained from the machine. The output is then used to assess the machine's understanding, and the training intensity is tuned accordingly for successive iterations. The proposed model thereby improves output accuracy by reducing errors within a controlled combination time.
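The substitution-and-scoring loop described above can be pictured with a short sketch. The code below is illustrative only, not the authors' implementation: `candidate_substitutes` (a learned keyword-substitution table) and `answer_confidence` (a function returning the comprehension model's answer and its confidence for a reconstructed query) are hypothetical placeholders standing in for components the abstract does not specify.

```python
# Hypothetical sketch of an iterative query-reconstruction loop.
# Assumptions: `candidate_substitutes` maps each keyword to learned
# replacement candidates, and `answer_confidence` scores a keyword
# sequence with the reading-comprehension model. Both are placeholders.
from itertools import product
from typing import Callable, Dict, List, Tuple


def reconstruct_query(
    query_keywords: List[str],
    candidate_substitutes: Dict[str, List[str]],
    answer_confidence: Callable[[List[str]], Tuple[str, float]],
    max_iterations: int = 10,
) -> Tuple[List[str], str, float]:
    """Swap keyword combinations until the answer confidence stops improving."""
    best_keywords = list(query_keywords)
    best_answer, best_score = answer_confidence(best_keywords)

    for _ in range(max_iterations):
        improved = False
        # Each keyword keeps itself as one option alongside its substitutes,
        # mirroring the "maximum individual combinations" in the abstract.
        options = [
            [kw] + candidate_substitutes.get(kw, []) for kw in best_keywords
        ]
        for combo in product(*options):
            answer, score = answer_confidence(list(combo))
            if score > best_score:
                best_keywords, best_answer, best_score = list(combo), answer, score
                improved = True
        if not improved:  # persist only while a better response is still found
            break

    return best_keywords, best_answer, best_score
```

In this reading, tuning the "training intensity" would correspond to updating the substitution table between iterations based on how confident the model's answers were; that step is outside the sketch.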













Data availability
The data that support the findings of this study are available from the corresponding author upon reasonable request.
Funding
This work was supported by the National Natural Science Foundation of China (62166018, 62266017).
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of interest
The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
About this article
Cite this article
Wang, P., Kamruzzaman, M.M. & Chen, Q. Machine reading comprehension model based on query reconstruction technology and deep learning. Neural Comput & Applic 36, 2155–2170 (2024). https://doi.org/10.1007/s00521-023-08698-4
Received:
Accepted:
Published:
Issue Date:
DOI: https://doi.org/10.1007/s00521-023-08698-4