Attributed network representation learning via improved graph attention with robust negative sampling


Abstract

Attributed network representation learning embeds graphs in a low-dimensional vector space such that the embedded vectors preserve the differences and similarities of the source graphs. To capture both the structural features and the node attributes of an attributed network, we propose a novel graph auto-encoder built from stacked encoder-decoder layers based on graph attention with robust negative sampling. The loss function combines the negative log-likelihood, a triplet distance, and weighted neighborhood attribute reconstruction. To alleviate over-fitting toward reconstructing either the graph structural features or the node attributes, we propose a trade-off algorithm that balances the reconstruction loss of node attributes against the reconstruction loss of structural features. Furthermore, to alleviate the impact of random sampling, we impose additional constraints on negative sampling based on node degree. Experimental results on several benchmark datasets for transductive and inductive learning tasks show that the proposed model is competitive with well-known methods in node classification and link prediction.
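
To make the degree-based sampling constraint and the triplet term concrete, the following is a minimal sketch assuming a dense 0/1 adjacency matrix and an already-learned embedding matrix z. The function names (sample_negatives, triplet_loss), the degree threshold min_degree, and the degree**0.75 smoothing are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sample_negatives(adj, anchor, num_neg, min_degree=2, rng=None):
    """Draw negatives for `anchor`: non-neighbours whose degree is at least
    `min_degree`, sampled with probability proportional to degree**0.75.
    (Illustrative constraint; the paper's exact rule may differ.)"""
    rng = rng or np.random.default_rng()
    degrees = adj.sum(axis=1)
    n = adj.shape[0]
    # candidates: not the anchor, not a neighbour, and degree above threshold
    mask = (np.arange(n) != anchor) & (adj[anchor] == 0) & (degrees >= min_degree)
    candidates = np.flatnonzero(mask)
    probs = degrees[candidates] ** 0.75
    probs = probs / probs.sum()
    return rng.choice(candidates, size=num_neg, replace=True, p=probs)

def triplet_loss(z, anchor, positive, negatives, margin=1.0):
    """Margin-based triplet distance on embedding rows of z."""
    d_pos = np.linalg.norm(z[anchor] - z[positive])
    d_neg = np.linalg.norm(z[anchor] - z[negatives], axis=1)
    return np.maximum(0.0, margin + d_pos - d_neg).mean()

# toy usage: a 6-node graph with random embeddings
adj = np.array([[0, 1, 1, 0, 0, 0],
                [1, 0, 1, 1, 0, 0],
                [1, 1, 0, 0, 1, 0],
                [0, 1, 0, 0, 1, 1],
                [0, 0, 1, 1, 0, 1],
                [0, 0, 0, 1, 1, 0]], dtype=float)
z = np.random.default_rng(0).normal(size=(6, 8))
negs = sample_negatives(adj, anchor=0, num_neg=2)
print(triplet_loss(z, anchor=0, positive=1, negatives=negs))
```

In the same spirit, the trade-off between the two reconstruction terms described above could be expressed as a weighted sum, e.g. L = α·L_attr + (1 − α)·L_struct with α adjusted during training, although the paper's actual balancing rule is not reproduced here.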


Notes

  1. Our implementation of GARNS may be found at https://github.com/fanhl/GARNS.


Acknowledgements

This work is supported by the Natural Science Foundation of Chongqing (No. cstc2019jscx-msxm0616), the Fundamental Research Funds for the Central Universities (No. 2019CDCGTX302, No. 2018CDPTCG000141), and the Major Natural Science Funds of the Chongqing Education Commission (No. KJZD-M201901401).

Author information

Corresponding authors

Correspondence to Huilian Fan or Yuanchang Zhong.



About this article


Cite this article

Fan, H., Zhong, Y., Zeng, G. et al. Attributed network representation learning via improved graph attention with robust negative sampling. Appl Intell 51, 416–426 (2021). https://doi.org/10.1007/s10489-020-01825-x

