Abstract
In document clustering (and dictionary learning), the Wasserstein distance (also known as the Earth Mover's or Kantorovich–Rubinstein distance; in this paper we use these terms interchangeably) has shown advantages as a fitting error for measuring how well a factorization approximates the original data. In particular, it can capture redundant information, for instance synonyms in bag-of-words representations, which classical metrics cannot retrieve in practice. However, despite smoothed approximations that allow faster computation, this distance still carries a high computational cost and remains difficult to handle on substantial amounts of data. To circumvent this issue, we propose a different NMF scheme that relies on the Kullback-Leibler divergence for the term approximating the original data, together with a regularization term that approximates the Wasserstein embeddings, in order to leverage more semantic relations. Experiments on benchmark datasets show that our proposal achieves good clustering and supports visualization of the clusters.
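The "smoothed approximation" mentioned above refers to entropy-regularized optimal transport, computable with Sinkhorn iterations (Cuturi, 2013). As a hedged illustration (not the paper's actual pipeline), the following sketch computes a smoothed Wasserstein distance between two word histograms; the cost matrix `M`, the regularization strength `reg`, and the iteration count are illustrative choices:

```python
import numpy as np

def sinkhorn_distance(a, b, M, reg=0.1, n_iter=200):
    """Entropy-regularized (smoothed) Wasserstein distance between
    histograms a and b under ground-cost matrix M."""
    K = np.exp(-M / reg)                # Gibbs kernel of the cost matrix
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)               # scale columns to match marginal b
        u = a / (K @ v)                 # scale rows to match marginal a
    P = u[:, None] * K * v[None, :]     # approximate transport plan
    return float(np.sum(P * M))         # transport cost of that plan

# Example: two documents as histograms over a 3-term vocabulary,
# with unit cost between distinct terms (a purely illustrative cost).
a = np.array([0.5, 0.5, 0.0])
b = np.array([0.0, 0.5, 0.5])
M = 1.0 - np.eye(3)
d = sinkhorn_distance(a, b, M)
```

With a small `reg`, `d` approaches the unregularized optimal-transport cost (here, moving 0.5 of the mass at unit cost, i.e. about 0.5); each Sinkhorn iteration costs only two matrix-vector products, which is the speedup the abstract alludes to.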
© 2020 Springer Nature Switzerland AG
Febrissy, M., Nadif, M. (2020). Wasserstein Embeddings for Nonnegative Matrix Factorization. In: Nicosia, G., et al. Machine Learning, Optimization, and Data Science. LOD 2020. Lecture Notes in Computer Science, vol. 12565. Springer, Cham. https://doi.org/10.1007/978-3-030-64583-0_29
Print ISBN: 978-3-030-64582-3
Online ISBN: 978-3-030-64583-0