Abstract
Neural topic models have attracted much attention for their training efficiency: methods based on the variational auto-encoder capture approximate distributions of the data, while those based on the Generative Adversarial Net (GAN) can capture an accurate posterior distribution. However, existing GAN-based neural topic models fail to model the document-topic distribution of input samples, making it difficult to obtain representations of the data in the latent topic space for downstream tasks. Moreover, to utilize the topics discovered by these models, it is time-consuming to manually interpret the meaning of each topic, label the generated topics, and filter out the topics of interest. To address these limitations, we propose a Reward-Modulated Adversarial Topic Model (RMATM). By integrating a topic predictor and a reward function into the GAN, RMATM can capture the document-topic distribution and discover topics of interest according to topic-related seed words. Furthermore, benefiting from the reward function, which uses topic-related seed words as weak supervision, RMATM is able to classify unlabeled documents. Extensive experiments on four benchmark corpora validate the effectiveness of RMATM.
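To illustrate the idea of seed words acting as weak supervision, the sketch below scores a topic by how many of its top-ranked words are topic-related seed words. This is a minimal, hypothetical scoring function for intuition only; the function name, the overlap-based formula, and the toy vocabulary are all assumptions, not the reward function defined in the paper.

```python
import numpy as np

def seed_word_reward(topic_word_dist, vocab, seed_words, top_k=10):
    """Hypothetical reward: fraction of the topic's top-k words that
    appear in the user-supplied seed-word set. A topic whose most
    probable words match the seeds receives a high reward."""
    top_ids = np.argsort(topic_word_dist)[::-1][:top_k]
    top_words = {vocab[i] for i in top_ids}
    return len(top_words & set(seed_words)) / top_k

# Toy example: a 6-word vocabulary and a "sports" seed set.
vocab = ["game", "team", "score", "market", "stock", "price"]
sports_seeds = {"game", "team", "score"}

# A topic concentrated on sports-related words.
topic = np.array([0.30, 0.25, 0.20, 0.10, 0.10, 0.05])
reward = seed_word_reward(topic, vocab, sports_seeds, top_k=3)
```

In a GAN-based training loop, such a score could modulate the generator's objective so that topics aligned with the seed words are reinforced, which is the role the abstract attributes to the reward function.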
Acknowledgment
The research described in this paper was supported by the National Natural Science Foundation of China (61972426).
Copyright information
© 2020 Springer Nature Switzerland AG
Cite this paper
Feng, Y., Feng, J., Rao, Y. (2020). Reward-Modulated Adversarial Topic Modeling. In: Nah, Y., Cui, B., Lee, SW., Yu, J.X., Moon, YS., Whang, S.E. (eds) Database Systems for Advanced Applications. DASFAA 2020. Lecture Notes in Computer Science(), vol 12112. Springer, Cham. https://doi.org/10.1007/978-3-030-59410-7_47
Print ISBN: 978-3-030-59409-1
Online ISBN: 978-3-030-59410-7
eBook Packages: Mathematics and Statistics (R0)