DOI: 10.1145/3357384.3358127 · CIKM Conference Proceedings · Short paper

Tensor Decomposition-based Node Embedding

Published: 03 November 2019

ABSTRACT

In recent years, node embedding algorithms, which learn low-dimensional vector representations for the nodes in a graph, have been one of the key research interests of the graph mining community. Existing algorithms either rely on computationally expensive eigendecomposition of large matrices, or require tuning of word-embedding hyperparameters because they represent the graph as node sequences analogous to sentences in a document. Moreover, the latent features produced by these algorithms are hard to interpret. In this paper, we present Tensor Decomposition-based Node Embedding (TDNE), a novel model for learning node representations for arbitrary types of graphs: undirected, directed, and/or weighted. Our model preserves the local and global structural properties of a graph by constructing a third-order tensor from the k-step transition probability matrices and decomposing that tensor through CANDECOMP/PARAFAC (CP) decomposition, producing an interpretable, low-dimensional vector space for the nodes. Our experimental evaluation on two well-known social network datasets shows TDNE to be interpretable, with respect to the understandability of the feature space, and precise, with respect to network reconstruction.
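The pipeline described above (stack the k-step transition probability matrices into a third-order tensor, then factor it with CP decomposition so each node gets a low-dimensional row vector) can be sketched in NumPy. This is a minimal illustration, not the authors' implementation: the toy graph, the rank, and the plain alternating-least-squares (ALS) solver are all assumptions made for the example.

```python
import numpy as np

def transition_matrix(A):
    # Row-normalize the adjacency matrix to get 1-step transition probabilities.
    deg = A.sum(axis=1, keepdims=True)
    deg[deg == 0] = 1.0
    return A / deg

def build_tensor(A, K):
    # Stack P, P^2, ..., P^K into an (n, n, K) tensor: slice k holds the
    # k-step transition probabilities (small k = local, large k = global structure).
    P = transition_matrix(A)
    Pk, slices = np.eye(len(A)), []
    for _ in range(K):
        Pk = Pk @ P
        slices.append(Pk)
    return np.stack(slices, axis=2)

def khatri_rao(X, Y):
    # Column-wise Kronecker product: column r is kron(X[:, r], Y[:, r]).
    return (X[:, None, :] * Y[None, :, :]).reshape(-1, X.shape[1])

def cp_als(T, rank, n_iter=200, seed=0):
    # CANDECOMP/PARAFAC via alternating least squares:
    # T[i, j, k] ≈ sum_r A[i, r] * B[j, r] * C[k, r].
    rng = np.random.default_rng(seed)
    n1, n2, n3 = T.shape
    A = rng.standard_normal((n1, rank))
    B = rng.standard_normal((n2, rank))
    C = rng.standard_normal((n3, rank))
    T1 = T.reshape(n1, -1)                     # mode-1 unfolding
    T2 = np.moveaxis(T, 1, 0).reshape(n2, -1)  # mode-2 unfolding
    T3 = np.moveaxis(T, 2, 0).reshape(n3, -1)  # mode-3 unfolding
    for _ in range(n_iter):
        A = T1 @ khatri_rao(B, C) @ np.linalg.pinv((B.T @ B) * (C.T @ C))
        B = T2 @ khatri_rao(A, C) @ np.linalg.pinv((A.T @ A) * (C.T @ C))
        C = T3 @ khatri_rao(A, B) @ np.linalg.pinv((A.T @ A) * (B.T @ B))
    return A, B, C

# Toy 5-node undirected graph (illustrative; not a dataset from the paper).
adj = np.array([[0, 1, 1, 0, 0],
                [1, 0, 1, 0, 0],
                [1, 1, 0, 1, 0],
                [0, 0, 1, 0, 1],
                [0, 0, 0, 1, 0]], dtype=float)
T = build_tensor(adj, K=3)
A, B, C = cp_als(T, rank=4)
embeddings = A          # one low-dimensional row vector per node
T_hat = np.einsum('ir,jr,kr->ijk', A, B, C)
rel_err = np.linalg.norm(T - T_hat) / np.linalg.norm(T)
```

Each row of `A` is a node embedding, and the mode-3 factor `C` records how strongly each step size k loads on each latent factor, which is one way the factors stay inspectable; a small reconstruction error `rel_err` indicates the stacked transition tensor is well captured at the chosen rank.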


Published in

CIKM '19: Proceedings of the 28th ACM International Conference on Information and Knowledge Management
November 2019 · 3373 pages
ISBN: 9781450369763
DOI: 10.1145/3357384

          Copyright © 2019 ACM

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from permissions@acm.org.

Publisher

Association for Computing Machinery, New York, NY, United States




Acceptance Rates

CIKM '19 paper acceptance rate: 202 of 1,031 submissions (20%). Overall acceptance rate: 1,861 of 8,427 submissions (22%).
