Multi-head Attention Induced Dynamic Hypergraph Convolutional Networks

  • Conference paper
  • Pattern Recognition and Computer Vision (PRCV 2023)
  • Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14433)

Abstract

Hypergraph neural networks (HGNNs) have recently attracted much attention from researchers due to their powerful modeling capability. Existing HGNNs usually derive data representations by capturing the high-order adjacency relations in a hypergraph. However, incomplete exploration and exploitation of the hypergraph structure leave high-order relations among samples under-modeled. To this end, we propose a novel hypergraph convolutional network (M-HGCN) to capture the latent structural properties of a hypergraph. Specifically, two novel designs are proposed to enhance the expressive capability of HGNNs. (1) A CNN-like spatial graph convolution and a self-adaptive hypergraph incidence matrix are employed to capture both the local and global structural properties of a hypergraph. (2) A dual-attention scheme is applied to hyperedges, which models the interactions across multiple hyperedges to form hypergraph-aware features. Experimental results on benchmark citation-network datasets demonstrate the superior performance of the proposed method over strong existing baselines.
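For orientation, the baseline that M-HGCN extends is the standard spectral hypergraph convolution of HGNN-style models, X' = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} X Θ, where H is the incidence matrix, Dv and De are vertex and hyperedge degree matrices, and W is a diagonal hyperedge-weight matrix. The sketch below is illustrative only, not the authors' implementation: it assumes unit hyperedge weights (W = I) and a toy 4-node, 2-hyperedge incidence matrix.

```python
import numpy as np

def hypergraph_conv(X, H, Theta):
    """One layer of the standard spectral hypergraph convolution:
    X' = ReLU(Dv^{-1/2} H De^{-1} H^T Dv^{-1/2} X Theta),
    with unit hyperedge weights (W = I)."""
    Dv = H.sum(axis=1)                       # vertex degrees, shape (n,)
    De = H.sum(axis=0)                       # hyperedge degrees, shape (m,)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(Dv))
    De_inv = np.diag(1.0 / De)
    # Symmetrically normalized hypergraph "adjacency" operator, shape (n, n)
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)    # ReLU nonlinearity

# Toy hypergraph: 4 nodes, 2 hyperedges (rows = nodes, cols = hyperedges)
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 1]], dtype=float)
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 3))       # node features
Theta = rng.normal(size=(3, 2))   # learnable layer weights
out = hypergraph_conv(X, H, Theta)
print(out.shape)                  # (4, 2)
```

M-HGCN departs from this fixed propagation in two ways described in the abstract: the incidence matrix H is made self-adaptive (learned) rather than static, and hyperedge interactions are modeled with a dual-attention scheme rather than uniform hyperedge weights.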


Acknowledgments

This work was supported by the National Key R&D Program of China (No. 2022ZD0118202) and the National Natural Science Foundation of China (No. 62376101, No. 62072386).

Author information

Correspondence to Taisong Jin.


Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Peng, X., Lin, W., Jin, T. (2024). Multi-head Attention Induced Dynamic Hypergraph Convolutional Networks. In: Liu, Q., et al. Pattern Recognition and Computer Vision. PRCV 2023. Lecture Notes in Computer Science, vol 14433. Springer, Singapore. https://doi.org/10.1007/978-981-99-8546-3_21

Download citation

  • DOI: https://doi.org/10.1007/978-981-99-8546-3_21

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-8545-6

  • Online ISBN: 978-981-99-8546-3

  • eBook Packages: Computer Science (R0)
