
GraphNAS++: Distributed Architecture Search for Graph Neural Networks


Abstract:

Graph neural networks (GNNs) are widely used to analyze non-Euclidean graph data. Despite their successes, designing graph neural networks requires heavy manual work and rich domain knowledge. Recently, neural architecture search algorithms have been used to automatically design neural architectures for CNNs and RNNs. Inspired by their success, we present GraphNAS, a graph neural architecture search algorithm that automatically designs graph neural architectures with reinforcement learning. Specifically, GraphNAS uses a recurrent network as the controller to generate variable-length strings that describe the architectures of graph neural networks, and trains the recurrent network with policy gradient to maximize the expected accuracy of the generated architectures on a validation dataset. Moreover, based on GraphNAS, we design a new model, GraphNAS++, that uses distributed neural architecture search. Whereas GraphNAS generates and evaluates only one candidate architecture per iteration, GraphNAS++ generates a mini-batch of candidate architectures and evaluates them in a distributed computing environment until convergence. Experiments on real-world graph datasets demonstrate that GraphNAS can design novel network architectures that rival the best human-invented architectures in accuracy. Moreover, GraphNAS++ speeds up the design process by at least five times using a distributed training framework with GPUs.
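The search loop described in the abstract can be sketched as follows. This is a minimal, illustrative Python sketch, not the paper's implementation: the search-space entries, the uniform sampling standing in for the RNN controller, and the random reward standing in for validation accuracy are all assumptions made for demonstration. It shows the GraphNAS++-style pattern of sampling a mini-batch of candidate architectures, evaluating each (in the paper, in parallel across GPUs), and forming REINFORCE advantages against a batch-mean baseline.

```python
import random

# Toy search space of GNN layer descriptors (illustrative names only,
# not the paper's actual action space).
SEARCH_SPACE = [
    ("attention",  ["gat", "gcn", "const"]),
    ("aggregator", ["sum", "mean", "max"]),
    ("activation", ["relu", "elu", "tanh"]),
    ("hidden",     [8, 16, 32]),
]

def sample_architecture(rng):
    # Stand-in for the RNN controller: one categorical choice per decision
    # step yields a variable-length architecture description.
    return [(name, rng.choice(opts)) for name, opts in SEARCH_SPACE]

def evaluate(arch, rng):
    # Stand-in for training the candidate GNN and reading its
    # validation accuracy; here a random toy reward.
    return rng.random()

def search_step(batch_size, rng):
    """One GraphNAS++-style iteration: sample a mini-batch of candidates,
    evaluate them (distributed over GPUs in the paper), and compute
    REINFORCE advantages (reward minus batch-mean baseline) that would
    scale the controller's log-probability gradients."""
    batch = [sample_architecture(rng) for _ in range(batch_size)]
    rewards = [evaluate(a, rng) for a in batch]
    baseline = sum(rewards) / len(rewards)
    advantages = [r - baseline for r in rewards]
    best = batch[rewards.index(max(rewards))]
    return best, advantages

rng = random.Random(0)
best, advantages = search_step(8, rng)
```

Evaluating the whole mini-batch before the policy update is what makes the distributed speed-up possible: the candidate evaluations are independent, so they can run concurrently on separate GPUs.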
Published in: IEEE Transactions on Knowledge and Data Engineering ( Volume: 35, Issue: 7, 01 July 2023)
Page(s): 6973 - 6987
Date of Publication: 26 May 2022
