DOI: 10.1145/3551901.3556488
Research Article · Public Access

Transfer of Performance Models Across Analog Circuit Topologies with Graph Neural Networks

Published: 12 September 2022

Abstract

In this work, graph neural networks (GNNs) and transfer learning are leveraged to transfer device sizing knowledge learned from data of related analog circuit topologies to predict the performance of a new topology. A graph is generated from the netlist of a circuit, with nodes representing the devices and edges the connections between devices. To allow for the simultaneous training of GNNs on data of multiple topologies, graph isomorphism networks are adopted to address the limitation of graph convolutional networks in distinguishing between different graph structures. The techniques are applied to transfer predictions of performance across four op-amp topologies in a 65 nm technology, with 10,000 sets of sizing and performance evaluations sampled for each circuit. Two scenarios, zero-shot learning and few-shot learning, are considered based on the availability of data in the target domain. Results from the analysis indicate that zero-shot learning with GNNs trained on all the data of the three related topologies is effective for coarse estimates of the performance of the fourth unseen circuit without requiring any data from the fourth circuit. Few-shot learning by fine-tuning the GNNs with a small dataset of 100 points from the target topology after pre-training on data from the other three topologies further boosts the model performance. The fine-tuned GNNs outperform the baseline artificial neural networks (ANNs) trained on the same dataset of 100 points from the target topology, with an average reduction in the root-mean-square error of 70.6%. Applying the proposed techniques, specifically GNNs and transfer learning, improves the sample efficiency of the performance models of analog ICs through the transfer of predictions across related circuit topologies.
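The netlist-to-graph construction described in the abstract can be sketched as follows. The simplified netlist format, device names, and helper function here are illustrative assumptions, not the authors' implementation:

```python
# Illustrative sketch (assumed, not the paper's code): build a device graph
# from a simplified SPICE-style netlist. Each device becomes a node, and an
# edge joins any two devices that share at least one net.

from itertools import combinations

def netlist_to_graph(netlist):
    """netlist: list of (device_name, [net names]) tuples."""
    nodes = [name for name, _ in netlist]
    edges = set()
    for (a, nets_a), (b, nets_b) in combinations(netlist, 2):
        if set(nets_a) & set(nets_b):  # the two devices share a net
            edges.add((a, b))
    return nodes, sorted(edges)

# Hypothetical two-transistor fragment: M1 and M2 share the net "out".
netlist = [("M1", ["in", "out", "vss"]), ("M2", ["vdd", "out", "bias"])]
nodes, edges = netlist_to_graph(netlist)
```

In a full flow, each node would also carry device features such as the device type and sizing parameters, which serve as the GNN inputs for performance prediction.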

Supplementary Material

MP4 File (MLCAD22-061.mp4)
A description of the work "Transfer of Performance Models Across Analog Circuit Topologies with Graph Neural Networks" is provided in the video. The background and motivation of the research are discussed, followed by a discussion of the techniques of GNNs and transfer learning. The characterization results of applying the techniques to transfer predictions across four op-amp topologies are analyzed and conclusions are provided. With zero-shot learning, GNNs provide coarse estimates of the circuit performance, resulting in lower test errors than the baseline ANNs in 14 of the 20 cases. With few-shot learning, GNNs provide an average reduction of 70.6% in test error (RMSE) compared to the baseline ANNs. The proposed techniques provide a means to improve the sample efficiency of simulation-based sizing methods.
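The abstract's motivation for adopting graph isomorphism networks (GINs) over graph convolutional networks (GCNs) can be illustrated with a minimal, assumed example: GCN-style mean aggregation maps different neighborhood multisets to the same value, while the injective sum aggregation used by GINs separates them, which is what allows one model to distinguish structurally different topologies during joint training.

```python
# Illustrative contrast (assumed, not from the paper) between mean aggregation
# (GCN-style) and sum aggregation (GIN-style). With identical node features,
# mean pooling cannot tell a node with one neighbor from a node with two,
# while the sum can.

def aggregate(neighbor_feats, mode):
    total = sum(neighbor_feats)
    return total / len(neighbor_feats) if mode == "mean" else total

# Every node carries the same scalar feature 1.0.
two_node = [1.0]        # a node in a 2-node graph sees one neighbor
triangle = [1.0, 1.0]   # a node in a triangle sees two neighbors

mean_a = aggregate(two_node, "mean")
mean_b = aggregate(triangle, "mean")  # equal to mean_a: indistinguishable
sum_a = aggregate(two_node, "sum")
sum_b = aggregate(triangle, "sum")    # differs from sum_a: distinguishable
```

In the full GIN update, the summed neighborhood is combined with the node's own feature as MLP((1 + eps) * h_v + sum over neighbors), so the multiset of neighbor features is preserved rather than averaged away.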


Cited By

  • Circuit-GNN: A Graph Neural Network for Transistor-level Modeling of Analog Circuit Hierarchies, 2023 IEEE International Symposium on Circuits and Systems (ISCAS), pp. 1-5, 21 May 2023. DOI: 10.1109/ISCAS46773.2023.10181617

Published In

MLCAD '22: Proceedings of the 2022 ACM/IEEE Workshop on Machine Learning for CAD
September 2022, 181 pages
ISBN: 9781450394864
DOI: 10.1145/3551901

Publisher

Association for Computing Machinery, New York, NY, United States

      Author Tags

      1. analog circuit sizing
      2. graph neural networks
      3. performance modeling of analog ICs
      4. transfer learning

      Conference

MLCAD '22: 2022 ACM/IEEE Workshop on Machine Learning for CAD
September 12-13, 2022, Virtual Event, China

      Acceptance Rates

      Overall Acceptance Rate 35 of 83 submissions, 42%
