ABSTRACT
Graph Neural Networks (GNNs) have revolutionized graph learning through efficiently learned node embeddings and have achieved promising results in various graph-related tasks such as node and graph classification. Within GNNs, a pooling operation reduces the size of the input graph by grouping nodes that share commonalities, with the aim of generating more robust and expressive latent representations. Pooling is therefore a critical operation that significantly affects downstream tasks. Existing global pooling methods mostly apply readout functions such as max or sum, but these functions neglect the hierarchical information of graphs. Clique-based hierarchical pooling methods have recently been developed to overcome the limitations of global pooling. However, such clique pooling methods perform a hard partition of nodes, assuming each node belongs to a single cluster, which destroys the topological relationships among nodes. Overlapping clusters are widespread in real-world networks, since a node can belong to more than one cluster. Here we introduce a new hierarchical graph pooling method to address this issue. Our method, named Quasi-CliquePool, builds on the concept of a quasi-clique, which generalizes the notion of a clique to extract dense but incomplete subgraphs of a graph. We also introduce a soft peel-off strategy that identifies overlapping cluster nodes in order to preserve the topological relationships among nodes. For a fair comparison, we follow the same procedure and training settings used by state-of-the-art pooling techniques. Our experiments demonstrate that combining Quasi-CliquePool with existing GNN architectures yields an average improvement of 2% accuracy on four out of six graph classification benchmarks compared to other existing pooling methods.
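To make the core idea concrete, the following is a minimal, self-contained sketch of quasi-clique extraction with overlapping membership. It is not the paper's algorithm: the density threshold `gamma`, the greedy growth rule, and the toy graph are all illustrative assumptions. A set of nodes is treated as a γ-quasi-clique when its edge density is at least γ, and clusters grown from different seeds are allowed to share nodes, mirroring the overlapping clusters the abstract describes.

```python
from itertools import combinations

def edge_density(adj, nodes):
    """Fraction of possible edges present among `nodes` (1.0 for < 2 nodes)."""
    nodes = list(nodes)
    if len(nodes) < 2:
        return 1.0
    possible = len(nodes) * (len(nodes) - 1) / 2
    present = sum(1 for u, v in combinations(nodes, 2) if v in adj[u])
    return present / possible

def greedy_quasi_clique(adj, seed, gamma=0.8):
    """Grow a gamma-quasi-clique around `seed`: repeatedly add the frontier
    node with the most edges into the current cluster, stopping as soon as
    adding it would drop the edge density below gamma. Illustrative only."""
    cluster = {seed}
    while True:
        frontier = {n for v in cluster for n in adj[v]} - cluster
        if not frontier:
            break
        best = max(frontier, key=lambda n: len(adj[n] & cluster))
        if edge_density(adj, cluster | {best}) < gamma:
            break
        cluster.add(best)
    return cluster

# Toy graph: two dense regions joined through node 3.
adj = {
    0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3},
    3: {0, 1, 2, 4, 5}, 4: {3, 5, 6}, 5: {3, 4, 6}, 6: {4, 5},
}
c1 = greedy_quasi_clique(adj, seed=0)  # dense region around nodes 0-3
c2 = greedy_quasi_clique(adj, seed=6)  # dense region around nodes 3-6
# Node 3 ends up in both clusters: an overlapping cluster node that a hard
# partition would be forced to assign to only one of them.
```

Note that neither cluster is a true clique (node 3 and node 6 are not adjacent in the second region), which is exactly the case where a quasi-clique relaxation keeps the dense subgraph together while a strict clique-based pooling would split it.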
Quasi-CliquePool: Hierarchical Graph Pooling for Graph Classification