Abstract:
In recent years, research on reading-comprehension question answering has drawn intense attention in Natural Language Processing. However, obtaining high-level semantic vector representations of questions and paragraphs remains a key issue. Drawing inspiration from DrQA [1], the question answering system proposed by Facebook, this paper proposes an attention-based question answering model that adds the binary representation of the paragraph, the paragraph's attention to the question, and the question's attention to the paragraph. Meanwhile, a self-attention calculation method is proposed to enhance the question semantic vector representation. Besides, the model uses multi-layer bidirectional Long Short-Term Memory (BiLSTM) networks to compute the hidden semantic vector representations of paragraphs and questions. Finally, bilinear functions are used to calculate the probability of the answer's position in the paragraph. The experimental results on the Stanford Question Answering Dataset (SQuAD) development set show an F1 score of 80.1% and an exact-match score of 71.4%, which demonstrates that the model outperforms DrQA, improving on it by 2% and 1.3% respectively.
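The final step the abstract describes, scoring each paragraph position as an answer start or end with a bilinear function between paragraph token states and the question vector, can be sketched roughly as follows. This is not the authors' code; the shapes, names, and random weights are illustrative assumptions:

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def bilinear_span_scores(P, q, W_start, W_end):
    """Bilinear answer-position scoring (sketch).

    P        : (n, d) paragraph token states, e.g. BiLSTM outputs
    q        : (d,)   question semantic vector
    W_start, W_end : (d, d) learned bilinear weight matrices
    Returns two probability distributions over the n positions,
    for the answer's start and end respectively.
    """
    p_start = softmax(P @ W_start @ q)  # score_i = p_i^T W_start q
    p_end = softmax(P @ W_end @ q)
    return p_start, p_end

# toy usage with random states and weights
rng = np.random.default_rng(0)
n, d = 6, 8
P = rng.standard_normal((n, d))
q = rng.standard_normal(d)
Ws = rng.standard_normal((d, d))
We = rng.standard_normal((d, d))
p_start, p_end = bilinear_span_scores(P, q, Ws, We)
```

At inference time the predicted span is typically the (start, end) pair with the highest joint probability subject to start ≤ end.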
Notes: As originally published text, pages or figures in the document were missing or not visible. A corrected replacement file was provided by the authors.
Published in: 2018 IEEE 29th International Conference on Application-specific Systems, Architectures and Processors (ASAP)
Date of Conference: 10-12 July 2018
Date Added to IEEE Xplore: 26 August 2018
Electronic ISSN: 2160-052X