Abstract
This paper proposes a time-aware topic model and a mini-batch variational inference for exploring chronological trends in document contents. Our contribution is twofold. First, to extract topics in a time-aware manner, our method uses two vector embeddings: an embedding of latent topics and an embedding of document timestamps. By combining these two embeddings and applying the softmax function, we obtain, for each topic, as many word probability distributions as there are document timestamps. This modeling allows us to capture salient topical trends. Second, to achieve memory efficiency, the variational inference is implemented as mini-batch gradient ascent that maximizes the evidence lower bound. This lets us estimate parameters in a manner similar to training neural networks, and our method was in fact implemented with a deep learning framework. The evaluation results show that using document timestamps improves test set perplexity, and that our test perplexity is comparable to that of collapsed Gibbs sampling, which is less memory-efficient than the proposed inference.
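The following is a minimal sketch, in plain NumPy, of the kind of construction described in the abstract: per-topic, per-timestamp word distributions obtained by combining a topic embedding with a timestamp embedding and applying a softmax over the vocabulary. The projection matrix `word_weights`, the additive combination, and all dimension names are illustrative assumptions, not the paper's exact parameterization.

```python
# Sketch (assumed construction): combine topic and timestamp embeddings,
# project to vocabulary-sized logits, and normalize with a softmax, yielding
# one word distribution per (topic, timestamp) pair.
import numpy as np

rng = np.random.default_rng(0)

num_topics = 10      # K: number of latent topics
num_timestamps = 12  # S: number of distinct document timestamps
vocab_size = 5000    # V: vocabulary size
embed_dim = 50       # dimensionality of the two embedding spaces

topic_emb = rng.normal(size=(num_topics, embed_dim))      # one vector per topic
time_emb = rng.normal(size=(num_timestamps, embed_dim))   # one vector per timestamp
word_weights = rng.normal(size=(vocab_size, embed_dim))   # maps embeddings to word logits (assumed)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

combined = topic_emb[:, None, :] + time_emb[None, :, :]   # (K, S, D)
logits = combined @ word_weights.T                         # (K, S, V)
beta = softmax(logits, axis=-1)                            # each row is a word distribution

assert np.allclose(beta.sum(axis=-1), 1.0)
print(beta.shape)  # (10, 12, 5000): one distribution over words per topic and timestamp
```

In the inference described in the abstract, parameters of this kind would be updated by mini-batch gradient ascent on the evidence lower bound, which is why the method can be implemented with a standard deep learning framework and its automatic differentiation, rather than with a sampler that keeps per-token state in memory.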
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this paper
Cite this paper
Masada, T., Takasu, A. (2018). Mini-Batch Variational Inference for Time-Aware Topic Modeling. In: Geng, X., Kang, B.H. (eds) PRICAI 2018: Trends in Artificial Intelligence. Lecture Notes in Computer Science, vol. 11013. Springer, Cham. https://doi.org/10.1007/978-3-319-97310-4_18
DOI: https://doi.org/10.1007/978-3-319-97310-4_18
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-97309-8
Online ISBN: 978-3-319-97310-4
eBook Packages: Computer Science (R0)