Abstract:
The increasing complexity and density of graphs pose significant challenges for training Graph Neural Networks (GNNs), often consuming substantial computational resources. Graph sparsification techniques address this challenge, and among them graph pruning is an effective method. Simulated Annealing (SA) is particularly well suited here because it can escape local optima and effectively navigate the solution space. In this work, after pruning the graph, we employ SA to incrementally explore candidate sub-graphs and identify the optimal graph structure. Empirical results demonstrate that the original graphs contain redundant edges, and that removing them yields sub-graphs on which our GNN achieves better performance.
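The abstract's SA-based search over edge subsets can be illustrated with a minimal sketch. This is not the paper's implementation: the names, the single-edge-flip move, and the geometric cooling schedule are assumptions, and `score_fn` is a hypothetical stand-in for the validation performance of a GNN trained on the sparsified graph.

```python
import math
import random

def simulated_annealing_prune(edges, score_fn, t0=1.0, cooling=0.95, steps=200, seed=0):
    """Search for an edge subset maximizing score_fn via simulated annealing.

    score_fn(subset) stands in for the GNN's validation performance on the
    sparsified graph (hypothetical; the paper's objective is not given here).
    """
    rng = random.Random(seed)
    mask = [True] * len(edges)            # start from the full (or pre-pruned) edge set
    best_mask = mask[:]
    cur = best = score_fn([e for e, m in zip(edges, mask) if m])
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(edges))
        mask[i] = not mask[i]             # propose flipping one edge in/out
        cand = score_fn([e for e, m in zip(edges, mask) if m])
        # accept improvements always; accept worse moves with Boltzmann probability,
        # which is what lets SA escape local optima
        if cand >= cur or rng.random() < math.exp((cand - cur) / max(t, 1e-9)):
            cur = cand
            if cur > best:
                best, best_mask = cur, mask[:]
        else:
            mask[i] = not mask[i]         # reject: revert the flip
        t *= cooling                      # geometric cooling schedule
    return [e for e, m in zip(edges, best_mask) if m], best

# Toy objective (illustration only): prefer subgraphs with about half the edges.
edges = [(u, v) for u in range(6) for v in range(u + 1, 6)]
kept, score = simulated_annealing_prune(
    edges, lambda s: -abs(len(s) - len(edges) // 2)
)
```

The toy objective replaces the expensive train-and-evaluate step; in practice each candidate subgraph would be scored by retraining or fine-tuning the GNN on it.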
Date of Conference: 30 June 2024 - 05 July 2024
Date Added to IEEE Xplore: 09 September 2024