Abstract:
Recurrent Neural Networks (RNNs) have shown promising results in many text generation tasks thanks to their ability to model complex data distributions. However, text generation models still cannot use context efficiently in their encoder or decoder RNNs. In this paper, we propose a novel Attention Recurrent Unit (ARU) to generate short descriptive texts conditioned on database records. Unlike conventional units such as the Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU), the ARU aligns context information from the encoder inside the unit itself, which improves the model's content selection and surface realization. We also design a method called DoubleAtten to enhance the attention distribution when computing generation probabilities. Extensive experiments on the recently released ROTOWIRE dataset demonstrate that ARU and DoubleAtten effectively improve model performance on the data-to-text generation task.
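The abstract only sketches the core idea, namely that attention over the encoder states is computed inside the recurrent unit rather than outside it. The paper's exact equations are not given here, so the following is a minimal PyTorch sketch under stated assumptions: a GRU-style gated cell with additive attention, where the aligned context vector feeds the gate computations directly. The class name ARUCell, the attention parameterization, and all layer names are hypothetical illustrations, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ARUCell(nn.Module):
    """Hypothetical GRU-like cell that attends over encoder outputs
    *inside* the unit, so the aligned context participates directly
    in the gate computations (the idea the abstract attributes to ARU)."""

    def __init__(self, input_size, hidden_size):
        super().__init__()
        self.hidden_size = hidden_size
        # Additive attention scorer over the encoder memory (assumed form).
        self.attn = nn.Linear(2 * hidden_size, 1)
        # Gates take the input, the previous state, and the aligned context.
        gate_in = input_size + 2 * hidden_size
        self.update = nn.Linear(gate_in, hidden_size)
        self.reset = nn.Linear(gate_in, hidden_size)
        self.candidate = nn.Linear(gate_in, hidden_size)

    def forward(self, x, h, enc_outputs):
        # x: (batch, input_size); h: (batch, hidden_size)
        # enc_outputs: (batch, src_len, hidden_size)
        # Score each encoder state against the previous hidden state.
        h_exp = h.unsqueeze(1).expand_as(enc_outputs)
        scores = self.attn(torch.cat([enc_outputs, h_exp], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)                  # (batch, src_len)
        context = torch.bmm(weights.unsqueeze(1), enc_outputs).squeeze(1)

        # The aligned context feeds every gate alongside x and h.
        z = torch.sigmoid(self.update(torch.cat([x, h, context], dim=-1)))
        r = torch.sigmoid(self.reset(torch.cat([x, h, context], dim=-1)))
        h_tilde = torch.tanh(
            self.candidate(torch.cat([x, r * h, context], dim=-1)))
        return (1 - z) * h + z * h_tilde, weights
```

In a standard attentional LSTM/GRU decoder, the context vector is mixed in only after the recurrent update; here it enters the gates themselves, which is one plausible reading of "aligned first inside the unit". The returned attention weights could then be reused or refined by a second attention pass when computing output probabilities, in the spirit of the DoubleAtten method, whose exact formulation is given only in the full paper.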
Date of Conference: 14-19 July 2019
Date Added to IEEE Xplore: 30 September 2019