Abstract:
Text generation is a typical natural language processing task and is the basis of machine translation and question answering. Deep learning techniques can achieve good performance on this task, provided that a huge number of parameters and massive amounts of data are available for training. However, human beings do not learn in this way. People combine previously learned knowledge with something new using only a few samples. This process is called one-shot learning. In this paper, we propose a neocortex-based computational model, the Semantic Hierarchical Temporal Memory model (SHTM), for one-shot text generation. The model is refined from the Hierarchical Temporal Memory model. LSTM is used for comparative study. Results on three public datasets show that SHTM performs much better than LSTM on the measures of mean precision and BLEU score. In addition, we utilize the SHTM model to do question answering in the fashion of text generation and verify its superiority.
Date of Conference: 09-12 October 2016
Date Added to IEEE Xplore: 09 February 2017