DOI: 10.1145/3489517.3530410
Research article

Functionality matters in netlist representation learning

Published: 23 August 2022

Abstract

Learning feasible representations from raw gate-level netlists is essential for incorporating machine learning techniques into logic synthesis, physical design, and verification. Existing message-passing-based graph learning methodologies focus merely on graph topology while overlooking gate functionality, and thus often fail to capture the underlying semantics, limiting their generalizability. To address this concern, we propose a novel netlist representation learning framework that uses a contrastive scheme to effectively acquire generic functional knowledge from netlists. We also propose a customized graph neural network (GNN) architecture that learns a set of independent aggregators to better cooperate with this framework. Comprehensive experiments on multiple complex real-world designs demonstrate that our proposed solution significantly outperforms state-of-the-art netlist feature learning flows.
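
For intuition, here is a minimal, self-contained sketch (not the authors' implementation) of the two ingredients the abstract mentions: a message-passing layer that combines several independent aggregators, and a contrastive objective that pulls two embeddings ("views") of the same netlist together while pushing other netlists in the batch apart. It assumes PyTorch, a dense adjacency matrix, and fixed sum/mean/max aggregators as a stand-in for the learned aggregator set described in the paper; the names MultiAggregatorLayer and nt_xent are hypothetical.

# Illustrative sketch only, not the paper's code. Assumes PyTorch, node
# features x (one row per gate) and a dense adjacency matrix adj, where
# adj[i, j] = 1 means gate j feeds gate i.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiAggregatorLayer(nn.Module):
    """Message passing with several independent aggregators (sum, mean, max),
    concatenated and projected; a fixed-aggregator stand-in for the paper's
    learned aggregator set."""
    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.msg = nn.Linear(in_dim, out_dim)          # per-neighbor message
        self.update = nn.Linear(3 * out_dim, out_dim)  # fuse the 3 aggregations

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        m = self.msg(x)                                # (N, out_dim)
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        agg_sum = adj @ m                              # sum over fan-in
        agg_mean = agg_sum / deg                       # mean over fan-in
        # max over fan-in: mask non-neighbors with -inf, then take the max
        neg_inf = torch.full_like(m.unsqueeze(0), float("-inf"))
        masked = torch.where(adj.unsqueeze(-1) > 0, m.unsqueeze(0), neg_inf)
        agg_max = masked.max(dim=1).values
        agg_max = torch.where(torch.isinf(agg_max), torch.zeros_like(agg_max), agg_max)
        return F.relu(self.update(torch.cat([agg_sum, agg_mean, agg_max], dim=-1)))

def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.2) -> torch.Tensor:
    """NT-Xent-style contrastive loss: matching rows of z1/z2 are positive
    pairs, all other rows in the batch serve as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / tau                         # (B, B) cosine similarities
    labels = torch.arange(z1.size(0))                  # positives on the diagonal
    return F.cross_entropy(logits, labels)

if __name__ == "__main__":
    # Toy data: two random "netlists" with 5 gates and 8-dim gate-type features.
    torch.manual_seed(0)
    layer = MultiAggregatorLayer(8, 16)
    graph_embs = []
    for _ in range(2):
        x = torch.randn(5, 8)
        adj = (torch.rand(5, 5) > 0.5).float()
        graph_embs.append(layer(x, adj).mean(dim=0))   # mean-pool gates -> graph embedding
    z1 = torch.stack(graph_embs)
    z2 = z1 + 0.01 * torch.randn_like(z1)              # second "view" = small perturbation
    print(nt_xent(z1, z2).item())

In the actual framework, the two views would presumably come from functionality-preserving augmentations of the same netlist rather than the random perturbation used in this toy, and the aggregators would be learned rather than fixed.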



Published In

DAC '22: Proceedings of the 59th ACM/IEEE Design Automation Conference
July 2022
1462 pages
ISBN: 9781450391429
DOI: 10.1145/3489517

Publisher

Association for Computing Machinery

New York, NY, United States



Qualifiers

  • Research-article

Conference

DAC '22: 59th ACM/IEEE Design Automation Conference
July 10-14, 2022
San Francisco, California

Acceptance Rates

Overall acceptance rate: 1,770 of 5,499 submissions (32%)


Article Metrics

  • Downloads (last 12 months): 348
  • Downloads (last 6 weeks): 21
Reflects downloads up to 05 Mar 2025


Cited By

  • (2025) Learning Gate-level Netlist Testability in the Presence of Unknowns through Graph Neural Networks. Proceedings of the 30th Asia and South Pacific Design Automation Conference, pp. 622-627. DOI: 10.1145/3658617.3697753
  • (2025) A Self-Supervised, Pre-Trained, and Cross-Stage-Aligned Circuit Encoder Provides a Foundation for Various Design Tasks. Proceedings of the 30th Asia and South Pacific Design Automation Conference, pp. 505-512. DOI: 10.1145/3658617.3697597
  • (2025) DeepSeq2: Enhanced Sequential Circuit Learning with Disentangled Representations. Proceedings of the 30th Asia and South Pacific Design Automation Conference, pp. 498-504. DOI: 10.1145/3658617.3697594
  • (2025) FGNN2: A Powerful Pretraining Framework for Learning the Logic Functionality of Circuits. IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems, 44(1), pp. 227-240. DOI: 10.1109/TCAD.2024.3434464
  • (2024) DeepSeq: Deep Sequential Circuit Learning. 2024 Design, Automation & Test in Europe Conference & Exhibition (DATE), pp. 1-2. DOI: 10.23919/DATE58400.2024.10546639
  • (2024) GAN-Place: Advancing Open Source Placers to Commercial-quality Using Generative Adversarial Networks and Transfer Learning. ACM Transactions on Design Automation of Electronic Systems, 29(2), pp. 1-17. DOI: 10.1145/3636461
  • (2024) Graph-Based Ranking Techniques for Improving VLSI Placement. 2024 IEEE 15th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON), pp. 715-719. DOI: 10.1109/UEMCON62879.2024.10754775
  • (2024) HWSim: Hardware Similarity Learning for Intellectual Property Piracy Detection. 2024 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 1-5. DOI: 10.1109/ISCAS58744.2024.10558324
  • (2024) PreRoutSGAT: Pre-Routing Timing Prediction Based on Graph Neural Network with Global Attention. 2024 6th International Conference on Circuits and Systems (ICCS), pp. 310-315. DOI: 10.1109/ICCS62517.2024.10846089
  • (2024) LSTP: A Logic Synthesis Timing Predictor. Proceedings of the 29th Asia and South Pacific Design Automation Conference, pp. 728-733. DOI: 10.1109/ASP-DAC58780.2024.10473925
