DOI: 10.1145/3488560.3498480

Scalable Graph Topology Learning via Spectral Densification

Published: 15 February 2022

Abstract

Graph learning plays an important role in many data mining and machine learning tasks, such as manifold learning, data representation and analysis, dimensionality reduction, clustering, and visualization. In this work, we introduce GRASPEL, a highly scalable spectral graph densification approach for learning graph topology from data. By restricting the precision matrix to be a graph-Laplacian-like matrix, our approach learns sparse undirected graphs from potentially high-dimensional input data. A distinctive property of the graphs learned by GRASPEL is that the spectral-embedding (or approximate effective-resistance) distances on the graph encode the similarities between the original input data points. By leveraging high-performance spectral methods, sparse yet spectrally robust graphs can be learned by identifying the most spectrally critical edges and adding them to the graph. Compared with prior state-of-the-art graph learning approaches, GRASPEL is more scalable and substantially improves the computing efficiency and solution quality of a variety of data mining and machine learning applications, such as manifold learning, spectral clustering (SC), and dimensionality reduction (DR).
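To make the abstract's ideas concrete, the following is a minimal, self-contained sketch (in Python with NumPy/SciPy) of a spectral densification loop of the kind described above. It is not the authors' GRASPEL implementation: the k-nearest-neighbor seed graph, the Gaussian edge weights, and the edge-ranking score (effective resistance divided by data distance) are illustrative assumptions, chosen only to show how effective-resistance distances can surface spectrally critical candidate edges.

```python
# Illustrative sketch only -- NOT the authors' GRASPEL implementation.
import numpy as np
from scipy.spatial.distance import pdist, squareform

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 5))            # 60 data points, 5 features

# 1. Seed with a sparse k-nearest-neighbor graph (Gaussian edge weights).
D = squareform(pdist(X))                # pairwise Euclidean distances
k = 3
W = np.zeros_like(D)
for i in range(len(X)):
    for j in np.argsort(D[i])[1:k + 1]:  # skip index 0 (the point itself)
        w = np.exp(-D[i, j] ** 2)
        W[i, j] = W[j, i] = w

# 2. Graph Laplacian and its Moore-Penrose pseudoinverse.
L = np.diag(W.sum(axis=1)) - W
L_pinv = np.linalg.pinv(L)

# 3. Effective-resistance distance between nodes i and j:
#    R(i, j) = (e_i - e_j)^T L^+ (e_i - e_j).
def eff_resistance(i, j):
    e = np.zeros(len(X))
    e[i], e[j] = 1.0, -1.0
    return float(e @ L_pinv @ e)

# 4. Rank candidate (non-)edges: a pair that is close in the data but has
#    large effective resistance on the current graph perturbs the spectrum
#    most when connected -- a proxy for "spectrally critical" edges.
cands = [(eff_resistance(i, j) / (D[i, j] + 1e-12), i, j)
         for i in range(len(X)) for j in range(i + 1, len(X)) if W[i, j] == 0]
score, i, j = max(cands)
W[i, j] = W[j, i] = np.exp(-D[i, j] ** 2)   # densify with the top edge
```

Iterating step 4 (recomputing the embedding after each batch of added edges) yields a graph whose effective-resistance distances increasingly track the similarities of the input points, which is the property the abstract attributes to the learned graphs.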

Supplementary Material

MP4 File (WSDM22-Feng.mp4)
A 10-min presentation for the WSDM'22 paper "Scalable Graph Topology Learning via Spectral Densification"



      Published In

      WSDM '22: Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining
      February 2022
      1690 pages
      ISBN:9781450391320
      DOI:10.1145/3488560

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. dimensionality reduction
      2. graph topology learning
      3. spectral clustering
      4. spectral graph theory

      Qualifiers

      • Research-article

      Funding Sources

      • National Science Foundation

      Conference

      WSDM '22

      Acceptance Rates

      Overall Acceptance Rate 498 of 2,863 submissions, 17%


      Cited By

      • (2024) Boosting Graph Spectral Sparsification via Parallel Sparse Approximate Inverse of Cholesky Factor. Proceedings of the 29th Asia and South Pacific Design Automation Conference, pp. 866-871. DOI: 10.1109/ASP-DAC58780.2024.10473809. Online publication date: 22-Jan-2024.
      • (2023) Bridging the Gap between Spatial and Spectral Domains: A Unified Framework for Graph Neural Networks. ACM Computing Surveys, 56(5):1-42. DOI: 10.1145/3627816. Online publication date: 8-Dec-2023.
      • (2023) Fixed Point Laplacian Mapping: A Geometrically Correct Manifold Learning Algorithm. 2023 International Joint Conference on Neural Networks (IJCNN), pp. 1-9. DOI: 10.1109/IJCNN54540.2023.10191488. Online publication date: 18-Jun-2023.
      • (2023) A Novel 3D Image Processing Method Based on Spectral Layout. 2023 8th International Conference on Signal and Image Processing (ICSIP), pp. 370-374. DOI: 10.1109/ICSIP57908.2023.10270920. Online publication date: 8-Jul-2023.
      • (2023) An Improved Robust ClusterGAN with the Perturbation Attack. 2023 3rd International Conference on Robotics, Automation and Intelligent Control (ICRAIC), pp. 265-270. DOI: 10.1109/ICRAIC61978.2023.00054. Online publication date: 24-Nov-2023.
      • (2023) Improving the Homophily of Heterophilic Graphs for Semi-Supervised Node Classification. 2023 IEEE International Conference on Multimedia and Expo (ICME), pp. 1865-1870. DOI: 10.1109/ICME55011.2023.00320. Online publication date: Jul-2023.
      • (2023) Hierarchical and Contrastive Representation Learning for Knowledge-Aware Recommendation. 2023 IEEE International Conference on Multimedia and Expo (ICME), pp. 1050-1055. DOI: 10.1109/ICME55011.2023.00184. Online publication date: Jul-2023.
      • (2023) DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing. 2023 IEEE International Conference on Data Mining (ICDM), pp. 558-567. DOI: 10.1109/ICDM58522.2023.00065. Online publication date: 1-Dec-2023.
      • (2023) Structured-Anchor Projected Clustering for Hyperspectral Images. ICASSP 2023 - IEEE International Conference on Acoustics, Speech and Signal Processing, pp. 1-5. DOI: 10.1109/ICASSP49357.2023.10096622. Online publication date: 4-Jun-2023.
      • (2022) Improving Spectral Clustering Using Spectrum-Preserving Node Aggregation. 2022 26th International Conference on Pattern Recognition (ICPR), pp. 3063-3068. DOI: 10.1109/ICPR56361.2022.9956605. Online publication date: 21-Aug-2022.
