Abstract:
Cross-lingual neural headline generation (CNHG), which aims to train a single, large neural network that directly generates a target-language headline from a source-language news document, has received considerable attention in recent years. Unlike conventional neural headline generation, CNHG faces the problem that no large-scale parallel corpora of source-language articles and target-language headlines exist, making it a zero-shot scenario. To address this problem, we propose zero-resource CNHG with reinforcement learning. We develop a reinforcement learning framework composed of two modules: a neural machine translation (NMT) module and a CNHG module. The NMT module translates an input document into a source-language document, and the CNHG module takes this translation as input to generate a target-language headline. Both modules then receive a reward for joint training. Experimental results show that our method significantly outperforms baseline models.
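For concreteness, the short Python sketch below traces the data flow of the two-module training loop as described in the abstract: the NMT module produces a source-language document, the CNHG module turns that document into a target-language headline, and a single reward drives updates to both modules. All function names and the toy data here are hypothetical placeholders, not the authors' implementation; in the paper the modules are neural networks and the update is a policy-gradient step.

# Minimal sketch of the joint training loop described above. Every function
# (translate, generate_headline, overlap_reward, update) is a hypothetical stub
# standing in for the paper's neural modules and policy-gradient update.

def translate(document):
    # NMT module (stub): map the input document to a source-language document.
    return document

def generate_headline(source_document):
    # CNHG module (stub): emit a target-language headline; here, the leading words.
    return " ".join(source_document.split()[:8])

def overlap_reward(headline, reference):
    # Unigram-overlap score standing in for a ROUGE-style reward.
    hyp, ref = set(headline.split()), set(reference.split())
    return len(hyp & ref) / max(len(ref), 1)

def update(module_name, reward):
    # Placeholder for a REINFORCE-style parameter update scaled by the reward.
    print(f"update {module_name}: reward = {reward:.3f}")

# One joint-training step: translate, generate, score, and reward both modules.
corpus = [
    ("stock markets rally as the economy grows three percent this quarter",
     "economy grows three percent"),
]
for document, reference_headline in corpus:
    source_document = translate(document)          # output of the NMT module
    headline = generate_headline(source_document)  # output of the CNHG module
    reward = overlap_reward(headline, reference_headline)
    update("NMT", reward)                          # both modules share one reward
    update("CNHG", reward)

The point of the shared reward is that the translation quality is judged only by how useful the translated document is for headline generation, so both modules can be trained jointly without any parallel article-headline data across languages.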
Published in: IEEE/ACM Transactions on Audio, Speech, and Language Processing ( Volume: 28)