DOI: 10.1145/3372422.3372442

Hybrid High-order in Graph Attention Layer

Published: 07 February 2020

Abstract

By approximating the eigenbasis of the graph Laplacian, the GC-layer proposed by Kipf & Welling [5] efficiently extends the convolution operation from the Euclidean domain to the graph domain, and end-to-end deep graph neural networks have since been widely developed and applied. However, a fixed neighborhood limits the learning ability of such models, and GAT [7] models global node pairs to avoid information loss. Formally, this modeling considers only the first-order proximity of the network, so higher-order information is transmitted indirectly and lossily, even when a multi-layer attention mechanism is stacked to enlarge the receptive field. To avoid this situation and capture higher-order information more directly, this paper establishes the concept of hybrid higher-order neighborhood learning on graphs. Unlike previous work, in which neighborhood information propagates implicitly through stacked layers and activation functions, our model, called H-GAT, explicitly obtains information from the high-order neighborhoods of each node and uses an attention mechanism to model distinct weights for nodes at different hop distances.
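The idea described in the abstract can be illustrated with a minimal sketch (not the authors' implementation): k-hop neighborhoods are obtained explicitly from powers of the adjacency matrix, each hop order is aggregated separately, and a softmax attention score mixes the hop orders. The function names, the per-hop scalar scoring, and the parameter shapes below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def hop_matrices(A, hops):
    """Return [M_1, ..., M_hops]: row-normalized k-hop reachability masks
    built from powers of the adjacency matrix with self-loops added."""
    n = A.shape[0]
    A_hat = A + np.eye(n)                        # add self-loops
    mats, P = [], np.eye(n)
    for _ in range(hops):
        P = P @ A_hat                            # walk count up to k hops
        M = (P > 0).astype(float)                # explicit k-hop mask
        mats.append(M / M.sum(axis=1, keepdims=True))
    return mats

def h_gat_layer(X, A, W, a, hops=2):
    """Aggregate features within each hop order, then mix the hop orders
    with softmax attention weights (one scalar score per hop, kept
    deliberately simple for illustration)."""
    H = X @ W                                            # shared transform
    per_hop = np.stack([M @ H for M in hop_matrices(A, hops)])  # (hops,N,F)
    scores = np.array([(h * a).sum() for h in per_hop])  # crude hop scores
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()                                 # softmax over hops
    return np.tensordot(alpha, per_hop, axes=1)          # (N, F)
```

Because each row-normalized hop matrix averages its neighborhood and the attention weights sum to one, the output rows remain convex combinations of the transformed features.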

References

[1]
K. Q. Weinberger and L. K. Saul. 2004. Unsupervised learning of image manifolds by semidefinite programming. In Proceedings of the 2004 IEEE.
[2]
B. Perozzi, R. Al-Rfou, and S. Skiena. 2014. DeepWalk: Online Learning of Social Representations. In Knowledge Discovery and Data Mining (KDD).
[3]
T. Mikolov, I. Sutskever, K. Chen, G. Corrado, and J. Dean. 2013. Distributed Representations of Words and Phrases and their Compositionality. In Advances in Neural Information Processing Systems (NIPS).
[4]
J. Tang, M. Qu, M. Wang, M. Zhang, J. Yan, and Q. Mei. 2015. LINE: Large-scale Information Network Embedding. In Proceedings of the 24th International Conference on World Wide Web.
[5]
T. Kipf and M. Welling. 2017. Semi-Supervised Classification with Graph Convolutional Networks. In International Conference on Learning Representations (ICLR).
[6]
M. Defferrard, X. Bresson, and P. Vandergheynst. 2016. Convolutional Neural Networks on Graphs with Fast Localized Spectral Filtering. In Advances in Neural Information Processing Systems (NIPS).
[7]
P. Velickovic, G. Cucurull, A. Casanova, A. Romero, P. Lio, and Y. Bengio. 2018. Graph Attention Networks. In International Conference on Learning Representations (ICLR).
[8]
W. Hamilton, R. Ying, and J. Leskovec. 2017. Inductive Representation Learning on Large Graphs. In Advances in Neural Information Processing Systems (NIPS).
[9]
M. Belkin, P. Niyogi, and V. Sindhwani. 2006. Manifold regularization: A geometric framework for learning from labeled and unlabeled examples. Journal of Machine Learning Research (JMLR).
[10]
J. Weston, F. Ratle, H. Mobahi, and R. Collobert. 2012. Deep learning via semi-supervised embedding. In Neural Networks: Tricks of the Trade, 639--655.
[11]
X. J. Zhu, Z. Ghahramani, and J. Lafferty. 2003. Semi-supervised learning using Gaussian fields and harmonic functions. In International Conference on Machine Learning (ICML).
[12]
Q. Lu and L. Getoor. 2003. Link-based classification. In International Conference on Machine Learning (ICML).
[13]
Z. Yang, W. Cohen, and R. Salakhutdinov. 2016. Revisiting Semi-Supervised Learning with Graph Embeddings. In International Conference on Machine Learning (ICML).
[14]
A. Grover and J. Leskovec. 2016. node2vec: Scalable Feature Learning for Networks. In KDD.
[15]
R. Ying, R. He, K. Chen, P. Eksombatchai, W. L. Hamilton, and J. Leskovec. 2018. Graph Convolutional Neural Networks for Web-Scale Recommender Systems. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.
[16]
Z. Liu, C. Chen, L. Li, J. Zhou, X. Li, L. Song, and Y. Qi. 2018. GeniePath: Graph Neural Networks with Adaptive Receptive Paths. In KDD.
[17]
A. Vaswani, N. Shazeer, N. Parmar, et al. 2017. Attention Is All You Need. In Advances in Neural Information Processing Systems (NIPS).
[18]
L. Tang and H. Liu. Relational learning via latent social dimensions. In Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining.

Information

Published In

CIIS '19: Proceedings of the 2019 2nd International Conference on Computational Intelligence and Intelligent Systems
November 2019
200 pages
ISBN:9781450372596
DOI:10.1145/3372422

In-Cooperation

  • Queensland University of Technology
  • City University of Hong Kong

Publisher

Association for Computing Machinery

New York, NY, United States

Publication History

Published: 07 February 2020

Author Tags

  1. Attention
  2. Explicit information flow
  3. GC-layer
  4. High-order
  5. Hybrid learning

Qualifiers

  • Research-article
  • Research
  • Refereed limited

Conference

CIIS 2019
