Abstract
Hypergraph neural networks (HGNNs) have recently attracted much attention from researchers due to their powerful modeling ability. Existing HGNNs usually derive data representations by capturing high-order adjacency relations in a hypergraph. However, incomplete exploration and exploitation of the hypergraph structure leads to a deficiency of high-order relations among samples. To this end, we propose a novel hypergraph convolutional network (M-HGCN) to capture the latent structural properties of a hypergraph. Specifically, two novel designs are proposed to enhance the expressive capability of HGNNs. (1) A CNN-like spatial graph convolution and a self-adaptive hypergraph incidence matrix are employed to capture both local and global structural properties of a hypergraph. (2) A dual-attention scheme is applied to hyperedges, which models interactions across multiple hyperedges to form hypergraph-aware features. Experimental results on benchmark citation network datasets demonstrate the superior performance of the proposed method over strong existing baselines.
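To make the building block concrete, the following is a minimal sketch of a standard hypergraph convolution layer of the kind M-HGCN builds on (the spectral form popularized by HGNN, Feng et al. 2019: X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Θ). It is an illustrative NumPy implementation under assumed shapes, not the authors' M-HGCN; in particular, the self-adaptive incidence matrix and dual-attention scheme of the paper are not shown, only the base convolution they extend.

```python
import numpy as np

def hypergraph_conv(X, H, Theta, w=None):
    """One hypergraph convolution layer (HGNN-style sketch):
        X' = ReLU( Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Theta )

    X:     (n, d)  node feature matrix
    H:     (n, m)  incidence matrix, H[v, e] = 1 iff node v lies in hyperedge e
    Theta: (d, k)  learnable weight matrix
    w:     (m,)    hyperedge weights (defaults to all ones)
    """
    n, m = H.shape
    if w is None:
        w = np.ones(m)
    Dv = (H * w).sum(axis=1)              # weighted node degrees
    De = H.sum(axis=0)                    # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    W = np.diag(w)
    # Normalized hypergraph adjacency: propagate node -> hyperedge -> node
    A = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)  # ReLU activation
```

In this formulation `H @ W @ De_inv @ H.T` aggregates node features through shared hyperedges, which is what gives the layer its high-order (beyond pairwise) receptive field; a learned or attention-reweighted `H`, as in the paper, slots in by replacing the fixed incidence matrix.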
Acknowledgments
This work was supported by the National Key R&D Program of China (No. 2022ZD0118202) and the National Natural Science Foundation of China (No. 62376101, No. 62072386).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Peng, X., Lin, W., Jin, T. (2024). Multi-head Attention Induced Dynamic Hypergraph Convolutional Networks. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14433. Springer, Singapore. https://doi.org/10.1007/978-981-99-8546-3_21
Publisher Name: Springer, Singapore
Print ISBN: 978-981-99-8545-6
Online ISBN: 978-981-99-8546-3