ABSTRACT
In this work, graph neural networks (GNNs) and transfer learning are leveraged to transfer device sizing knowledge learned from data of related analog circuit topologies to predict the performance of a new topology. A graph is generated from the netlist of a circuit, with nodes representing the devices and edges representing the connections between them. To allow for the simultaneous training of GNNs on data from multiple topologies, graph isomorphism networks (GINs) are adopted to address the limitation of graph convolutional networks in distinguishing between different graph structures. The techniques are applied to transfer performance predictions across four op-amp topologies in a 65 nm technology, with 10,000 sets of sizing and performance evaluations sampled for each circuit. Two scenarios, zero-shot learning and few-shot learning, are considered based on the availability of data in the target domain. Results from the analysis indicate that zero-shot learning with GNNs trained on the combined data of three related topologies provides effective coarse estimates of the performance of the fourth, unseen circuit without requiring any data from that circuit. Few-shot learning, in which the GNNs are pre-trained on data from the other three topologies and then fine-tuned with a small dataset of 100 points from the target topology, further improves model accuracy. The fine-tuned GNNs outperform baseline artificial neural networks (ANNs) trained on the same dataset of 100 points from the target topology, with an average reduction in the root-mean-square error of 70.6%. Applying the proposed techniques, specifically GNNs and transfer learning, improves the sample efficiency of performance models of analog ICs through the transfer of predictions across related circuit topologies.
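The graph construction described above can be sketched in plain Python: devices become nodes, and an edge is drawn between any two devices that share a net. The netlist format and device names below are hypothetical, chosen only to illustrate the idea; the paper's actual extraction pipeline and node/edge features are not specified here.

```python
from itertools import combinations
from collections import defaultdict

def netlist_to_graph(netlist):
    """Build a device graph from a netlist.

    netlist: dict mapping each device name to the list of nets its
    terminals connect to (hypothetical format). Returns (nodes, edges),
    where an edge joins two devices that share at least one net.
    """
    # Invert the netlist: for each net, collect the devices touching it.
    net_to_devices = defaultdict(set)
    for device, nets in netlist.items():
        for net in nets:
            net_to_devices[net].add(device)

    # Any two devices on the same net are connected in the graph.
    edges = set()
    for devices in net_to_devices.values():
        for a, b in combinations(sorted(devices), 2):
            edges.add((a, b))
    return sorted(netlist), sorted(edges)

# Toy two-transistor example (hypothetical device/net names).
netlist = {
    "M1": ["in", "out", "vss"],
    "M2": ["out", "vdd", "vdd"],
}
nodes, edges = netlist_to_graph(netlist)
# M1 and M2 share the net "out", so the graph has the single edge (M1, M2).
```

A GIN would then operate on this graph by aggregating neighbor embeddings with a sum (rather than the mean used by graph convolutional networks), which is what lets it distinguish graph structures that mean-pooling conflates.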