Abstract:
Latent Dirichlet Allocation (LDA) is a powerful technique for extracting topics from documents. The original LDA takes a Bag-of-Words representation as input and produces topic distributions over documents as output. The drawback of Bag-of-Words is that it represents each word as a plain one-hot encoding, which carries no word-level information. Later research in Natural Language Processing (NLP) demonstrated that word embedding techniques such as the Skip-gram model provide good representations that capture relationships and semantic information between words, and recent studies show that many NLP tasks gain better performance by using word embeddings to represent words. In this paper, we propose Deep Word-Topic Latent Dirichlet Allocation (DWT-LDA), a new process for training LDA with word embeddings. A neural network over word embeddings is incorporated into the Collapsed Gibbs Sampling process as an alternative choice for word-topic assignment. To evaluate our model quantitatively, we compare our approach against the original LDA using topic coherence and topic diversity metrics. The experimental results show that our method generates more coherent and diverse topics.
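The abstract describes mixing a neural word-topic scorer into the Collapsed Gibbs Sampling loop, but gives no implementation details. The sketch below is a minimal illustration of one way such a hybrid sampling step could look, assuming pretrained embeddings, a single linear layer as the "deep" word-topic network, and a mixing probability `mix` for choosing between the network and the standard LDA conditional; all of these names and parameters are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes; the paper does not specify these.
V, K, D = 1000, 10, 50      # vocabulary size, number of topics, embedding dimension
alpha, beta = 0.1, 0.01     # symmetric Dirichlet hyperparameters

# Pretrained word embeddings (e.g. from a Skip-gram model); random stand-ins here.
embeddings = rng.normal(size=(V, D))

# Toy word-topic network: one linear layer from embedding to topic logits.
W = rng.normal(scale=0.1, size=(D, K))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def lda_conditional(w, d, n_wk, n_dk, n_k):
    """Standard collapsed Gibbs conditional p(z=k | w, d, rest), unnormalized.

    n_wk: (V, K) word-topic counts, n_dk: (num_docs, K) document-topic
    counts, n_k: (K,) topic totals, all excluding the current token.
    """
    return (n_wk[w] + beta) / (n_k + V * beta) * (n_dk[d] + alpha)

def sample_topic(w, d, n_wk, n_dk, n_k, mix=0.5):
    """Sample a topic for word w in document d.

    With probability `mix` (an assumed parameter, not from the paper),
    draw the topic from the embedding-based network's distribution
    instead of the usual LDA conditional.
    """
    if rng.random() < mix:
        p = softmax(embeddings[w] @ W)   # network's word-topic distribution
    else:
        p = lda_conditional(w, d, n_wk, n_dk, n_k)
        p = p / p.sum()
    return rng.choice(K, p=p)
```

Inside a full sampler, the usual bookkeeping still applies: decrement the counts for the current token, call `sample_topic`, then increment the counts for the newly assigned topic.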
Published in: 2021 18th International Joint Conference on Computer Science and Software Engineering (JCSSE)
Date of Conference: 30 June 2021 - 02 July 2021
Date Added to IEEE Xplore: 27 July 2021