GNAT: Leveraging Weighted Negative Sampling for Improved Graph Attention Network Performance

  • Conference paper

Advanced Intelligent Computing Technology and Applications (ICIC 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14089)

Abstract

Graph Neural Networks (GNNs) are effective tools for graph representation learning. Among them, the Graph Attention Network (GAT) has attracted considerable attention because it assigns different weights to different nodes when generating a new node representation. In most GNNs, including GAT, only first-order adjacent nodes are used for message passing; these can be regarded as positive samples. Many non-adjacent nodes, which can be seen as negative samples, receive far less attention in graph learning. Although several negative sampling methods have been proposed, existing works assign all negative samples the same weight when learning node representations, which limits the model's learning ability and can introduce irrelevant information. In this paper, we distinguish the importance of different negative samples by assigning them varying weights through the attention mechanism, and propose the Graph Negative-enhanced Attention Network (GNAT). Specifically, we first select appropriate negative samples for each node; then we use the multi-head attention mechanism to give the negative samples different weights. In this way, when learning node representations, GNAT can discriminate the significance of negative samples and reduce the influence of irrelevant information. Experimental evaluations show that appropriately weighted negative samples enhance the overall performance of the GAT model, and that GNAT achieves outstanding performance compared with state-of-the-art methods.
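The core idea of the abstract, attending separately over positive (adjacent) and negative (non-adjacent) samples so that each negative sample receives its own learned weight, can be sketched as a single-head layer. This is a minimal illustrative sketch, not the paper's implementation: the function name `gnat_layer`, the separate attention vectors `a_pos` and `a_neg`, the `tanh` nonlinearity, and the subtraction of the negative aggregate scaled by `lam` are all assumptions made for illustration.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - x.max())
    return e / e.sum()

def gnat_layer(H, W, a_pos, a_neg, neighbors, negatives, lam=0.5):
    """Single-head sketch of attention over positive and negative samples.

    H: (n, f_in) node features; W: (f_in, f_out) shared projection;
    a_pos, a_neg: (2 * f_out,) attention vectors for positive/negative scores;
    neighbors, negatives: dicts mapping node index -> list of node indices.
    """
    Z = H @ W                       # project all node features
    out = np.zeros_like(Z)
    for i in range(Z.shape[0]):
        # attention over first-order neighbours (positive samples)
        pos = neighbors[i]
        e_pos = np.array([np.concatenate([Z[i], Z[j]]) @ a_pos for j in pos])
        alpha = softmax(e_pos)
        agg_pos = alpha @ Z[pos]
        # attention over sampled non-adjacent nodes (negative samples):
        # unlike uniform negative sampling, each negative sample gets its
        # own attention weight, so irrelevant negatives are down-weighted
        neg = negatives[i]
        e_neg = np.array([np.concatenate([Z[i], Z[j]]) @ a_neg for j in neg])
        beta = softmax(e_neg)
        agg_neg = beta @ Z[neg]
        # negatives are subtracted, pushing the representation away from them
        out[i] = np.tanh(agg_pos - lam * agg_neg)
    return out
```

A multi-head version would run this layer with independent `W`, `a_pos`, and `a_neg` per head and concatenate or average the outputs, mirroring how GAT combines heads.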



Acknowledgements

This work was supported by the National Natural Science Foundation of China (K204101210002).

Author information

Correspondence to Jeffrey Zheng.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper

Cite this paper

Lu, Y., Wang, Q., Zhou, W., Zheng, J. (2023). GNAT: Leveraging Weighted Negative Sampling for Improved Graph Attention Network Performance. In: Huang, DS., Premaratne, P., Jin, B., Qu, B., Jo, KH., Hussain, A. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2023. Lecture Notes in Computer Science(), vol 14089. Springer, Singapore. https://doi.org/10.1007/978-981-99-4752-2_34

  • DOI: https://doi.org/10.1007/978-981-99-4752-2_34

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-4751-5

  • Online ISBN: 978-981-99-4752-2

  • eBook Packages: Computer Science, Computer Science (R0)
