DOI: 10.1145/3544109.3544167
research-article

Application Research of Attention Mechanism in Machine Reading Comprehension

Published: 18 July 2022

Abstract

In recent years, machine reading comprehension has become one of the most active topics in natural language processing, and the attention mechanism has been widely adopted as a key method for extracting relevant information from passages. This paper summarizes the development of the attention mechanism and its application in machine reading comprehension. After introducing the derivation of the attention mechanism, its network framework, and the methods for computing attention weights over the input data, it presents three kinds of attention-based machine reading comprehension models. Finally, it analyzes future development trends of the attention mechanism in the field of machine reading comprehension. The attention mechanism helps a model extract important information and make more accurate judgments, and it is likely to be applied more widely across machine reading comprehension tasks in the future.
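The weight calculation described above can be illustrated with the standard scaled dot-product formulation of attention (Vaswani et al., 2017): each query is scored against every key, the scores are normalized with a softmax into weights, and the values are combined by those weights. The sketch below is a minimal NumPy illustration of this general mechanism, not the implementation of any specific model surveyed in the paper; the array shapes are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Score each query against every key, normalize the scores into
    attention weights, and return the weighted sum of the values."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)   # (num_queries, num_keys) similarity matrix
    weights = softmax(scores)          # each row sums to 1
    return weights @ V, weights

# Toy example: 2 question-side queries attending over 5 passage-side tokens.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))  # query vectors (e.g., question representation)
K = rng.normal(size=(5, 4))  # key vectors (e.g., passage representation)
V = rng.normal(size=(5, 4))  # value vectors (usually tied to the keys)
output, weights = scaled_dot_product_attention(Q, K, V)
```

In reading-comprehension models, the rows of `weights` show how strongly each question position attends to each passage position, which is the "important information" extraction the abstract refers to.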

    Published In

    IPEC '22: Proceedings of the 3rd Asia-Pacific Conference on Image Processing, Electronics and Computers
    April 2022
    1065 pages
    ISBN:9781450395786
    DOI:10.1145/3544109
    Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

    Publisher

    Association for Computing Machinery

    New York, NY, United States

    Author Tags

    1. Attention mechanism
    2. Machine reading comprehension model
    3. Weight information

    Qualifiers

    • Research-article
    • Research
    • Refereed limited

    Conference

    IPEC2022

