
HyperEF: Spectral Hypergraph Coarsening by Effective-Resistance Clustering

Published: 22 December 2022

Abstract

This paper introduces HyperEF, a scalable algorithmic framework for spectral coarsening (decomposition) of large-scale hypergraphs that exploits hyperedge effective resistances. Motivated by the latest theoretical framework for low-resistance-diameter decomposition of simple graphs, HyperEF aims to decompose large hypergraphs into multiple node clusters with only a few inter-cluster hyperedges. The key component of HyperEF is a nearly-linear-time algorithm for estimating hyperedge effective resistances, which allows incorporating the latest diffusion-based nonlinear quadratic operators defined on hypergraphs. To achieve good runtime scalability, HyperEF searches within the Krylov subspace (approximate eigensubspace) to identify nearly-optimal vectors for approximating hyperedge effective resistances. In addition, a node-weight propagation scheme for multilevel spectral hypergraph decomposition is introduced to achieve even greater node coarsening ratios. Extensive experimental results on real-world VLSI designs show that, compared with state-of-the-art hypergraph partitioning (clustering) methods, HyperEF more effectively coarsens (decomposes) hypergraphs without losing key structural (spectral) properties of the original hypergraphs, while achieving over 70× runtime speedups over hMetis and 20× speedups over HyperSF.
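The core recipe described above (estimate each hyperedge's effective resistance, then merge nodes joined by low-resistance hyperedges so that only high-resistance inter-cluster hyperedges survive) can be sketched on a toy hypergraph. The sketch below is an illustrative assumption, not the authors' implementation: it uses a weighted clique expansion with 1/(k-1) edge weights, takes a hyperedge's resistance to be the maximum pairwise resistance among its nodes, and computes resistances via a dense Laplacian pseudoinverse in place of the paper's scalable Krylov-subspace estimator; all function names are hypothetical.

```python
import numpy as np

def clique_expansion_laplacian(n, hyperedges):
    # Laplacian of the weighted clique expansion: a hyperedge of size k
    # becomes a clique whose pairwise edges each get weight 1/(k-1).
    L = np.zeros((n, n))
    for e in hyperedges:
        w = 1.0 / (len(e) - 1)
        for i in range(len(e)):
            for j in range(i + 1, len(e)):
                u, v = e[i], e[j]
                L[u, u] += w
                L[v, v] += w
                L[u, v] -= w
                L[v, u] -= w
    return L

def hyperedge_resistances(L, hyperedges):
    # Effective resistance of a hyperedge, taken here as the maximum
    # pairwise resistance R(u, v) = L+[u,u] + L+[v,v] - 2*L+[u,v] among
    # its nodes. The dense pseudoinverse stands in for the paper's
    # scalable Krylov-subspace estimator, so this only suits toy inputs.
    Lp = np.linalg.pinv(L)
    return [max(Lp[u, u] + Lp[v, v] - 2 * Lp[u, v]
                for i, u in enumerate(e) for v in e[i + 1:])
            for e in hyperedges]

def coarsen(n, hyperedges, threshold):
    # Union-find contraction: merge the nodes of every hyperedge whose
    # estimated resistance falls below the threshold, so only the
    # high-resistance (inter-cluster) hyperedges survive coarsening.
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    L = clique_expansion_laplacian(n, hyperedges)
    for e, r in zip(hyperedges, hyperedge_resistances(L, hyperedges)):
        if r < threshold:
            for v in e[1:]:
                parent[find(v)] = find(e[0])
    clusters = {}
    for v in range(n):
        clusters.setdefault(find(v), []).append(v)
    return list(clusters.values())
```

On a hypergraph with two tightly connected node groups joined by a single bridging hyperedge, only the bridge's resistance approaches 1, so contraction below a threshold under 1 leaves exactly the two intended clusters.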

References

[1]
A. Aghdaei, Z. Zhao, and Z. Feng. HyperSF: Spectral hypergraph coarsening via flow-based local clustering. CoRR, abs/2108.07901, 2021.
[2]
V. L. Alev, N. Anari, L. C. Lau, and S. Oveis Gharan. Graph clustering using effective resistance. In 9th Innovations in Theoretical Computer Science Conference (ITCS 2018). Schloss Dagstuhl-Leibniz-Zentrum fuer Informatik, 2018.
[3]
Ü. V. Çatalyürek and C. Aykanat. Patoh (partitioning tool for hypergraphs). In Encyclopedia of Parallel Computing, pages 1479--1487. Springer, 2011.
[4]
T.-H. H. Chan and Z. Liang. Generalizing the hypergraph laplacian via a diffusion process with mediators. Theoretical Computer Science, 806:416--428, 2020.
[5]
T.-H. H. Chan, A. Louis, Z. G. Tang, and C. Zhang. Spectral properties of hypergraph laplacian and approximation algorithms. Journal of the ACM (JACM), 65(3):15, 2018.
[6]
J. Chen, Y. Saad, and Z. Zhang. Graph coarsening: from scientific computing to machine learning. SeMA Journal, 79(1):187--223, 2022.
[7]
C. Deng, Z. Zhao, Y. Wang, Z. Zhang, and Z. Feng. GraphZoom: A Multi-level Spectral Approach for Accurate and Scalable Graph Embedding. In International Conference on Learning Representations, 2019.
[8]
K. D. Devine, E. G. Boman, R. T. Heaphy, R. H. Bisseling, and U. V. Catalyurek. Parallel hypergraph partitioning for scientific computing. In Proceedings 20th IEEE International Parallel & Distributed Processing Symposium, pages 10--pp. IEEE, 2006.
[9]
Z. Feng. Similarity-aware spectral sparsification by edge filtering. In Design Automation Conference (DAC). IEEE, 2018.
[10]
Z. Feng. Grass: Graph spectral sparsification leveraging scalable spectral perturbation analysis. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 39(12):4944--4957, 2020.
[11]
A. Grover and J. Leskovec. node2vec: Scalable Feature Learning for Networks. In Proceedings of the 22nd ACM SIGKDD international conference on Knowledge discovery and data mining, pages 855--864. ACM, 2016.
[12]
L. Hagen and A. Kahng. New spectral methods for ratio cut partitioning and clustering. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 11(9):1074--1085, 1992.
[13]
V. Iakovlev, M. Heinonen, and H. Lähdesmäki. Learning continuous-time pdes from sparse data with graph neural networks. In International Conference on Learning Representations, 2020.
[14]
M. Kapralov, R. Krauthgamer, J. Tardos, and Y. Yoshida. Towards tight bounds for spectral sparsification of hypergraphs. In Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing, pages 598--611, 2021.
[15]
M. Kapralov, R. Krauthgamer, J. Tardos, and Y. Yoshida. Spectral hypergraph sparsifiers of nearly linear size. In 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science (FOCS), pages 1159--1170. IEEE, 2022.
[16]
G. Karypis, R. Aggarwal, V. Kumar, and S. Shekhar. Multilevel hypergraph partitioning: Applications in VLSI domain. IEEE Transactions on Very Large Scale Integration (VLSI) Systems, 7(1):69--79, 1999.
[17]
K. Kunal, T. Dhar, M. Madhusudan, J. Poojary, A. Sharma, W. Xu, S. M. Burns, J. Hu, R. Harjani, and S. S. Sapatnekar. Gana: Graph convolutional network based automated netlist annotation for analog circuits. In 2020 Design, Automation & Test in Europe Conference & Exhibition (DATE), pages 55--60. IEEE, 2020.
[18]
J. R. Lee, S. O. Gharan, and L. Trevisan. Multiway spectral partitioning and higher-order cheeger inequalities. Journal of the ACM (JACM), 61(6):1--30, 2014.
[19]
Y. T. Lee and H. Sun. An SDP-based Algorithm for Linear-sized Spectral Sparsification. In Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, STOC 2017, pages 678--687, New York, NY, USA, 2017. ACM.
[20]
Z. Li, N. Kovachki, K. Azizzadenesheli, B. Liu, A. Stuart, K. Bhattacharya, and A. Anandkumar. Multipole graph neural operator for parametric partial differential equations. Advances in Neural Information Processing Systems, 33, 2020.
[21]
J. Lim, S. Ryu, K. Park, Y. J. Choe, J. Ham, and W. Y. Kim. Predicting drug-target interaction using a novel graph neural network with 3d structure-embedded graph representation. Journal of chemical information and modeling, 59(9):3981--3988, 2019.
[22]
A. Loukas and P. Vandergheynst. Spectrally approximating large graphs with smaller graphs. In International Conference on Machine Learning, pages 3237--3246. PMLR, 2018.
[23]
A. Mirhoseini, A. Goldie, M. Yazgan, J. W. Jiang, E. Songhori, S. Wang, Y.-J. Lee, E. Johnson, O. Pathak, and A. Nazi. A graph placement methodology for fast chip design. Nature, 594(7862):207--212, 2021.
[24]
X. Ouvrard. Hypergraphs: an introduction and review. arXiv preprint arXiv:2002.05014, 2020.
[25]
B. Perozzi, R. Al-Rfou, and S. Skiena. Deepwalk: Online Learning of Social Representations. In Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining, pages 701--710. ACM, 2014.
[26]
P. C. Rathi, R. F. Ludlow, and M. L. Verdonk. Practical high-quality electrostatic potential surfaces for drug discovery using a graph-convolutional deep neural network. Journal of Medicinal Chemistry, 2019.
[27]
I. Safro, P. Sanders, and C. Schulz. Advanced coarsening schemes for graph partitioning. Journal of Experimental Algorithmics (JEA), 19:1--24, 2015.
[28]
M. Schlichtkrull, T. N. Kipf, P. Bloem, R. Van Den Berg, I. Titov, and M. Welling. Modeling relational data with graph convolutional networks. In European Semantic Web Conference, pages 593--607. Springer, 2018.
[29]
R. Shaydulin, J. Chen, and I. Safro. Relaxation-based coarsening for multilevel hypergraph partitioning. Multiscale Modeling and Simulation, 17(1):482--506, Jan 2019.
[30]
T. Soma and Y. Yoshida. Spectral sparsification of hypergraphs. In Proceedings of the Thirtieth Annual ACM-SIAM Symposium on Discrete Algorithms, pages 2570--2581. SIAM, 2019.
[31]
D. Spielman and N. Srivastava. Graph Sparsification by Effective Resistances. SIAM Journal on Computing, 40(6):1913--1926, 2011.
[32]
D. Spielman and S. Teng. Spectral partitioning works: Planar graphs and finite element meshes. In Foundations of Computer Science (FOCS), 1996. Proceedings., 37th Annual Symposium on, pages 96--105. IEEE, 1996.
[33]
D. Spielman and S. Teng. Spectral sparsification of graphs. SIAM Journal on Computing, 40(4):981--1025, 2011.
[34]
B. Vastenhouw and R. H. Bisseling. A two-dimensional data distribution method for parallel sparse matrix-vector multiplication. SIAM review, 47(1):67--95, 2005.
[35]
H. Wang, K. Wang, J. Yang, L. Shen, N. Sun, H.-S. Lee, and S. Han. Gcn-rl circuit designer: Transferable transistor sizing with graph neural networks and reinforcement learning. In 2020 57th ACM/IEEE Design Automation Conference (DAC), pages 1--6. IEEE, 2020.
[36]
G. Zhang, H. He, and D. Katabi. Circuit-gnn: Graph neural networks for distributed circuit design. In International Conference on Machine Learning, pages 7364--7373. PMLR, 2019.
[37]
M. Zhang and Y. Chen. Link prediction based on graph neural networks. In Advances in Neural Information Processing Systems, pages 5165--5175, 2018.
[38]
Y. Zhang, Z. Zhao, and Z. Feng. Sf-grass: Solver-free graph spectral sparsification. In 2020 IEEE/ACM International Conference On Computer Aided Design (ICCAD), pages 1--8. IEEE, 2020.
[39]
Z. Zhao and Z. Feng. Effective-resistance preserving spectral reduction of graphs. In Proceedings of the 56th Annual Design Automation Conference 2019, DAC '19, pages 109:1--109:6, New York, NY, USA, 2019. ACM.
[40]
Z. Zhao, Y. Zhang, and Z. Feng. Towards scalable spectral embedding and data visualization via spectral coarsening. In Proceedings of the 14th ACM International Conference on Web Search and Data Mining, pages 869--877, 2021.
[41]
D. Zhou, J. Huang, and B. Schölkopf. Learning with hypergraphs: Clustering, classification, and embedding. Advances in neural information processing systems, 19:1601--1608, 2006.

Cited By

  • (2024) SGM-PINN: Sampling Graphical Models for Faster Training of Physics-Informed Neural Networks. Proceedings of the 61st ACM/IEEE Design Automation Conference, pages 1-6. DOI: 10.1145/3649329.3656521. Online publication date: 23-Jun-2024.
  • (2024) inGRASS: Incremental Graph Spectral Sparsification via Low-Resistance-Diameter Decomposition. Proceedings of the 61st ACM/IEEE Design Automation Conference, pages 1-6. DOI: 10.1145/3649329.3656520. Online publication date: 23-Jun-2024.
  • (2024) Boosting Graph Spectral Sparsification via Parallel Sparse Approximate Inverse of Cholesky Factor. Proceedings of the 29th Asia and South Pacific Design Automation Conference, pages 866-871. DOI: 10.1109/ASP-DAC58780.2024.10473809. Online publication date: 22-Jan-2024.


Published In

ICCAD '22: Proceedings of the 41st IEEE/ACM International Conference on Computer-Aided Design
October 2022
1467 pages
ISBN: 9781450392174
DOI: 10.1145/3508352

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected]

In-Cooperation

• IEEE-EDS: Electronic Devices Society
• IEEE CAS
• IEEE CEDA

Publisher

Association for Computing Machinery
New York, NY, United States

Author Tags

1. effective resistance
2. graph clustering
3. hypergraph coarsening
4. spectral graph theory

Qualifiers

• Research-article

Funding Sources

• National Science Foundation

Conference

ICCAD '22: IEEE/ACM International Conference on Computer-Aided Design
October 30 - November 3, 2022
San Diego, California

Acceptance Rates

Overall Acceptance Rate: 457 of 1,762 submissions, 26%
