
Generalized Large-Context Language Models Based on Forward-Backward Hierarchical Recurrent Encoder-Decoder Models



Abstract:

This paper presents a generalized form of large-context language models (LCLMs) that can take linguistic contexts beyond utterance boundaries into consideration. In discourse-level and conversation-level automatic speech recognition (ASR) tasks, which must handle a series of utterances, it is essential to capture long-range linguistic contexts beyond utterance boundaries. Previous LCLM studies focused mainly on utilizing past contexts; none fully utilized future contexts, because LMs typically process words in time order. Our key idea is to apply LCLMs in a setting where ASR results for the whole series of utterances are already available from a first decoding pass, which makes it possible for the LCLMs to leverage future contexts. In this paper, we propose generalized LCLMs (GLCLMs) based on forward-backward hierarchical recurrent encoder-decoder models, in which the generative probability of each utterance is computed by leveraging not only past contexts but also future contexts beyond utterance boundaries. To introduce GLCLMs into ASR efficiently, we also propose a global-context iterative rescoring method that repeatedly rescores the ASR hypotheses of an individual utterance using the surrounding ASR hypotheses. Experiments on discourse-level ASR tasks demonstrate the effectiveness of our GLCLM approach.
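To make the global-context iterative rescoring idea concrete, below is a minimal sketch of the rescoring loop as the abstract describes it: each utterance's N-best list is repeatedly rescored using the currently selected hypotheses of the surrounding utterances as past and future context. The helper `glclm_log_prob` (standing in for the forward-backward GLCLM score), the interpolation weight `lam`, and the convergence criterion are all assumptions for illustration; the abstract does not specify these details.

from typing import Callable, List, Tuple

Hypothesis = Tuple[str, float]  # (word sequence, first-pass ASR log score)

def iterative_rescore(
    nbest_lists: List[List[Hypothesis]],
    glclm_log_prob: Callable[[str, List[str], List[str]], float],
    lam: float = 0.5,     # assumed interpolation weight (not from the paper)
    max_iters: int = 5,   # assumed sweep limit (not from the paper)
) -> List[str]:
    """Repeatedly rescore each utterance's N-best hypotheses using the
    currently selected hypotheses of the surrounding utterances."""
    # Initialize with the first-pass 1-best hypothesis of every utterance.
    selected = [nbest[0][0] for nbest in nbest_lists]
    for _ in range(max_iters):
        changed = False
        for t, nbest in enumerate(nbest_lists):
            past = selected[:t]        # selected hypotheses before utterance t
            future = selected[t + 1:]  # selected hypotheses after utterance t

            def total_score(hyp: Hypothesis) -> float:
                words, asr_score = hyp
                # Interpolate the first-pass ASR score with the GLCLM score
                # conditioned on both past and future contexts.
                return (1 - lam) * asr_score + lam * glclm_log_prob(words, past, future)

            best = max(nbest, key=total_score)[0]
            if best != selected[t]:
                selected[t] = best
                changed = True
        if not changed:  # converged: a full sweep left every selection unchanged
            break
    return selected

Sweeping over utterances until no selection changes is one natural reading of "repeatedly rescores"; since updating one utterance's hypothesis alters the context available to its neighbors, a fixed point (or iteration cap) is needed to terminate.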
Date of Conference: 14-18 December 2019
Date Added to IEEE Xplore: 20 February 2020
Conference Location: Singapore
