T-GAN: A deep learning framework for prediction of temporal complex networks with adaptive graph convolution and attention mechanism☆
Introduction
A complex network is a graph with non-trivial topological features, which often arises in real systems such as the Internet, social networks, sensor networks, transportation networks and utility networks. The study of complex networks is still a relatively new but active research area. Early research on complex networks focused mainly on analyzing and modeling their features. More specifically, the small-world [1] and scale-free [2] characteristics of complex networks have been deeply explored and theoretically proved. In addition, the problems of community detection [3] and link prediction [4] have been widely investigated for complex networks. With more extreme weather events and advanced large-scale cyber-security attacks, the vulnerability of real complex networks has never been as exposed as it is now, and the cost of a complex network breaking down can be very high. There is increasing demand for the prediction of complex network states and events, which can provide important support for intelligent decision making on the control and management of complex networks. However, prediction of complex networks is very challenging due to the large scale of the networks and the complex interactions among the network nodes. The problem of complex network prediction has rarely been studied in the literature.
Many real applications on complex networks can be modeled as the dual problems of network topology prediction and network node feature forecasting. For example, as shown in Fig. 1, e-commerce recommendation can be modeled as a topology prediction problem in a user-commodity bipartite graph [5], in which the purchasing behaviors of consumers are recorded at successive time steps that constitute the time sequence of the bipartite graph; taking advantage of the historical information from this time sequence to predict future orders (connections between consumers and items at the next time step) is the core objective of recommendation systems. Malicious node detection in wireless sensor networks (WSN) [6], which can be applied to network lifetime assessment for sensor networks [7], and risk forecasting for finance companies [8] can be considered as node classification tasks in complex networks, as shown in Fig. 2. In Fig. 2, there is only one malicious node in the network at the first time step, with another 2 nodes infected soon afterwards. Hence, there is an urgent need to use the time series to study the evolution pattern of the infection; making predictions for the next time step is of vital importance.
Topology evolution in social networks can be formulated as a link prediction problem for undirected or directed weighted graphs, as shown in Fig. 3, where the leftmost graph is the original snapshot of the dynamic network. In the next snapshot, the blue dotted line indicates that the link between node 1 and node 2 is broken, and the solid black line denotes a new link between node 2 and node 5. The topology prediction task for the temporal network shown in Fig. 3 is to predict the weight or link state of each node pair of the graph at the next time step.
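The link prediction formulation above can be made concrete with adjacency matrices. The sketch below encodes two hypothetical 5-node snapshots of the kind shown in Fig. 3 (the edge lists are illustrative assumptions, not the paper's data) and recovers the node pairs whose link state changed between steps, which is exactly what the topology prediction task must forecast.

```python
import numpy as np

# Hypothetical snapshots of a 5-node network, encoded as symmetric
# adjacency matrices (1 = link present, 0 = absent).
A_t = np.zeros((5, 5), dtype=int)
for i, j in [(0, 1), (1, 2), (2, 3), (3, 4)]:
    A_t[i, j] = A_t[j, i] = 1

A_next = A_t.copy()
A_next[0, 1] = A_next[1, 0] = 0   # link between nodes 1 and 2 broken
A_next[1, 4] = A_next[4, 1] = 1   # new link between nodes 2 and 5

# The prediction target: the link state of every node pair at the next
# step; here we list the upper-triangle pairs whose state flipped.
changed = np.argwhere(np.triu(A_t != A_next))
print(changed)  # [[0 1] [1 4]]
```

A learned predictor would output an estimate of `A_next` from the history of earlier snapshots instead of reading it off, but the input/output shapes are the same.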
For real complex networks, along with the complex topology, another prominent feature, temporal evolution, makes the network prediction and control problems more challenging. The network topology and the properties of the network nodes and links can all change significantly over time. Most existing research on complex networks studies static networks, while temporally evolving complex networks have received much less attention.
In view of the above research gaps and challenges in the prediction of complex network properties and events, we are motivated to propose a novel end-to-end deep learning based network model, called temporal graph convolution and attention (T-GAN), for predicting properties and events of temporal complex networks. An encoder-decoder framework is applied to achieve this objective. As long short-term memory (LSTM) has demonstrated a strong ability to extract temporal features and has achieved huge success in deep learning based computer vision and natural language processing, we apply it in our proposed model for joint extraction of both spatial and temporal features of complex networks. As GCN adopts the symmetric Laplacian matrix as the graph shift operator in the frequency domain, it has been proved that GCN has the property of low-pass filtering [9], which directly limits the depth of GCN based deep learning models [10]. Therefore, we propose an Adaptive Graph Convolution Network (AGCN) and integrate it with LSTM cells, which gives T-GAN a stronger ability to extract features of complex networks. Furthermore, a dual attention mechanism is introduced to improve the sensitivity of the model to each time step of temporal networks. After generating the embedded representation vectors for the nodes in dynamic complex networks, a single-layer neural network serves as the decoder, in which a nonlinear affine transformation is used as the mapping function. Compared with a fully connected neural network, the number of parameters of the decoder designed in T-GAN is greatly reduced due to parameter sharing, which plays a significant role in reducing training time and avoiding over-fitting.
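The parameter-sharing argument for the decoder can be sketched in a few lines. This is an illustrative reconstruction, not the paper's implementation: the shapes (N nodes, d-dimensional embeddings, k outputs per node) and the sigmoid nonlinearity are assumptions standing in for whatever nonlinear affine transformation the authors use.

```python
import numpy as np

# Hedged sketch of a single-layer decoder sigma(Z W + b) shared across
# all nodes. Shapes are illustrative: N nodes, d-dim embeddings, k outputs.
rng = np.random.default_rng(0)
N, d, k = 6, 8, 1
Z = rng.normal(size=(N, d))          # node embeddings from the encoder
W = rng.normal(size=(d, k)) * 0.1    # one weight matrix shared by all nodes
b = np.zeros(k)

def decode(Z, W, b):
    # Nonlinear affine transformation; sigmoid chosen for illustration.
    return 1.0 / (1.0 + np.exp(-(Z @ W + b)))

Y = decode(Z, W, b)
print(Y.shape)  # (6, 1)
```

Because `W` and `b` are shared, the decoder has only d·k + k trainable parameters regardless of the number of nodes N, whereas a fully connected layer over the flattened embeddings would scale with N, which is the parameter-reduction point made above.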
The main contributions of this paper are summarized as follows:
- We propose a novel deep learning based network model (T-GAN) with an encoder-decoder framework for the prediction of properties and events of complex networks. The model can be used for the control and management of complex networks, with many real applications.
- For the encoder of T-GAN, we propose a new adaptive graph convolution network and integrate it with LSTM cells, which gives T-GAN a strong ability to extract features of complex networks and greatly improves the scalability of the model. We also apply attention blocks to help capture the importance and uniqueness of different time steps in complex networks and enhance the sensitivity of the model.
- Our proposed T-GAN architecture is general and scalable, and can be used for a wide range of real applications. We demonstrate the application of T-GAN to three prediction tasks for evolving complex networks, namely node classification, feature forecasting and topology prediction, over 6 open datasets. Our T-GAN based approach significantly outperforms the existing methods, achieving improvements of more than 4.7% in recall and 25.1% in precision.
- We also conducted additional experiments to evaluate the generalization of the proposed model in learning the characteristics of time-series images. Extensive experiments demonstrate the effectiveness of T-GAN in learning spatial and temporal features and predicting properties of complex networks.
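Recall and precision, the metrics used above to quantify the improvement, are computed from the overlap between predicted and true links. The toy link sets below are purely illustrative, not results from the paper's datasets.

```python
# Toy example of the recall/precision metrics cited above.
true_links = {(0, 1), (1, 2), (2, 3), (3, 4)}   # ground-truth edges
pred_links = {(0, 1), (1, 2), (1, 4)}           # model's predicted edges

tp = len(true_links & pred_links)                # true positives
precision = tp / len(pred_links)  # fraction of predicted links that are real
recall = tp / len(true_links)     # fraction of real links that were found
print(round(precision, 3), round(recall, 3))     # 0.667 0.5
```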
The remainder of the paper is organized as follows. Section 2 presents the related works. Section 3 presents the problem definition and notations. Section 4 introduces the proposed architecture in detail. Extensive experimental results and analysis are shown in Section 5, and additional experiments are presented in Section 6. Finally, conclusions are drawn in Section 7.
Section snippets
Literature review
Complex networks are a relatively new research area. The main focus of complex network research in the early stage was on the analysis and modeling of complex networks. Graph embedding (graph representation learning) is an important research branch of deep learning applied in complex networks [11], which takes advantage of the powerful non-linear fitting capability of deep neural networks to map high-dimensional sparse networks to dense vectors in a lower-dimensional space. It is convenient to
Problem definition and notations
A temporal complex network can be denoted by a time sequence {G_1, G_2, …, G_T}, where G_t is a sampled snapshot at time step t, which is a triple comprised of the vertex (node) set, edge set and node features, i.e., G_t = (V, E_t, X_t). V = {v_1, v_2, …, v_N} denotes the set of vertices in the network. E_t denotes the set of edges connecting the network nodes at time step t. X_t ∈ R^{N×F} is the node feature matrix at sampling time t, where N is the number of nodes in the network and F is the dimension of node features.
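The snapshot notation above can be sketched in plain Python. The names and shapes below are illustrative assumptions: a fixed vertex set, an edge set that changes per step, and a random feature matrix standing in for real node features.

```python
import numpy as np

# A temporal network as a sequence of snapshots G_t = (V, E_t, X_t)
# with a fixed vertex set V of N nodes and F-dimensional node features.
N, F = 4, 3
V = list(range(N))

def snapshot(edges, seed):
    rng = np.random.default_rng(seed)
    E_t = set(edges)              # edge set at time step t
    X_t = rng.random((N, F))      # node feature matrix, shape (N, F)
    return (V, E_t, X_t)

# Time sequence {G_1, G_2, G_3}: the edge set evolves between steps.
G = [snapshot({(0, 1), (1, 2)}, 1),
     snapshot({(0, 1), (1, 2), (2, 3)}, 2),
     snapshot({(1, 2), (2, 3)}, 3)]
print(len(G), len(G[1][1]))  # 3 snapshots; 3 edges in the second one
```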
System framework of T-GAN
In order to capture the evolution pattern of the topology and node features of temporal networks, we propose a deep learning network model, T-GAN, with adaptive graph convolution and an attention mechanism. T-GAN is a deep neural network model consisting of an encoder and a decoder, which integrates the topological structure of complex networks with the extensive feature information of vertices for learning and modeling the evolutionary properties of temporal networks. Stacked-LSTM layers are used as the
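The exact AGCN formulation is not reproduced in this snippet. As a point of reference, one step of the standard graph convolution it adapts, H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W), can be sketched as follows; all shapes and values are illustrative.

```python
import numpy as np

# Reference sketch of one standard GCN propagation step; the adaptive
# variant (AGCN) proposed in the paper modifies this graph shift
# operator, and its exact form is not reproduced here.
def gcn_step(A, H, W):
    A_hat = A + np.eye(A.shape[0])           # add self-loops
    d = A_hat.sum(axis=1)                    # node degrees of A_hat
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))   # D^{-1/2}
    S = D_inv_sqrt @ A_hat @ D_inv_sqrt      # symmetric normalization
    return np.maximum(S @ H @ W, 0.0)        # linear map + ReLU

rng = np.random.default_rng(0)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)  # path graph
H = rng.random((3, 4))                       # input node features
W = rng.random((4, 2))                       # layer weights
print(gcn_step(A, H, W).shape)               # (3, 2)
```

The symmetric normalization here is the low-pass graph shift operator mentioned in the introduction; stacking many such layers attenuates high-frequency signal components, which motivates the adaptive design.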
Experiments and analysis
In order to verify the effectiveness of the proposed model, we conduct various experiments and analyses of the model for 3 applications over 6 open temporal complex network datasets. We also compare our approach with the current baseline methods. All experiments are run on a server equipped with two Nvidia RTX 3060 GPUs. The deep learning model is built with Keras 2.1.4. The version of TensorFlow on the server is TensorFlow-GPU 1.14.0, and the CUDA version is 10.1. The code of the T-GAN deep learning
Additional experiments beyond graph
In this section, we will visualize the beneficial effect that the attention module shows on the proposed model through additional experiments, and evaluate the feasibility of applying the revised T-GAN to the image reconstruction and prediction task.
Conclusion
Prediction of properties and events of complex networks is of increasing importance but has not been well studied. In this paper, we investigated the prediction of properties and events of complex networks with temporally changing features, and proposed a novel deep learning based model T-GAN for the challenging prediction task. In the T-GAN network model, adaptive graph convolution network was proposed and integrated with LSTM to extract the spatial characteristics of nodes and learn the
CRediT authorship contribution statement
Ru Huang: Conceptualization, Methodology. Lei Ma: Software, Data curation, Writing - original draft. Jianhua He: Visualization, Investigation. Xiaoli Chu: Supervision, Validation, Writing - review & editing.
Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.
Acknowledgment
This work was supported by the National Natural Science Foundation of China, grant numbers 61673178 and 61922063, and the Natural Science Foundation of Shanghai, No. 20ZR1413800. Our work has also received funding from the European Union’s Horizon 2020 research and innovation programme under the Marie Sklodowska-Curie grant agreement No 824019 (project COSAFE) and the Marie Sklodowska-Curie grant agreement No 101022280 (project VESAFE). The authors would like to thank the anonymous reviewers for their
References (56)
- et al., Complex propagation on directed small world networks, Phys. A Stat. Mech. Its Appl. (2010)
- et al., Interaction between epidemic spread and collective behavior in scale-free networks with community structure, J. Theor. Biol. (2019)
- et al., Community detection in complex networks with an ambiguous structure using central node based link prediction, Knowledge-Based Syst. (2020)
- et al., Projection-based link prediction in a bipartite network, Inf. Sci. (Ny) (2017)
- et al., Multi-kernel Gaussian process latent variable regression model for high-dimensional sequential data modeling, Neurocomputing (2019)
- et al., E-LSTM-D: A deep learning framework for dynamic network link prediction, IEEE Trans. Syst. Man Cybern. Syst. (2019)
- et al., Extended geometric models for stereoscopic 3D with vertical screen disparity, Displays (2020)
- et al., Perceptual-based quality assessment for audio-visual services: A survey, Signal Process. Image Commun. (2010)
- et al., Evolutionary network embedding preserving both local proximity and community structure, IEEE Trans. Evol. Comput. (2020)
- et al., Detection of good and bad sensor nodes in the presence of malicious attacks and its application to data aggregation, IEEE Trans. Signal Inf. Process. over Networks (2018)
- Stealthy attacks in wireless ad hoc networks: detection and countermeasure, IEEE Trans. Mob. Comput.
- CENDA: Camouflage event based malicious node detection architecture, IEEE 6th Int. Conf. Mob. Adhoc Sens. Syst.
- Learning graph embedding with adversarial training methods, IEEE Trans. Cybern.
- Parallelizing Word2Vec in shared and distributed memory, IEEE Trans. Parallel Distrib. Syst.
- Random-walk computation of similarities between nodes of a graph with application to collaborative recommendation, IEEE Trans. Knowl. Data Eng.
- Word2Vec, Nat. Lang. Eng.
- Variational learning for switching state-space models, Neural Comput.
- Link prediction in dynamic social networks by integrating different types of information, Appl. Intell.
- DynGEM: deep embedding method for dynamic graphs, IJCAI Int. Work. Represent. Learn. Graphs
Ru Huang received his B.S. degree from Nanjing University, Nanjing, China, in 1999 and his Ph.D. degree in circuit and system from Shanghai Jiao Tong University, Shanghai, China, in 2008. He was a visiting scholar in the University of Wisconsin-Madison, WI, USA, from March 31, 2015 to March 30, 2016. He is currently an associate professor of electronics and communication engineering at East China University of Science and Technology, Shanghai, China. His current research interests include wireless sensor networks, complex networks and software-defined networks. ([email protected])
Lei Ma received his B.S. degree in information engineering from East China University of Science and Technology in 2018. He is now pursuing his M.S. degree in Information and Communication Engineering at East China University of Science and Technology. His research directions comprise complex networks, deep learning, and data mining, specializing in building deep learning models for data mining tasks in complex networks. ([email protected])
Jianhua He received the Ph.D. degree from Nanyang Technological University, Singapore, in 2002. He is a Reader at University of Essex, U.K. His research interests include 5G and beyond technologies, ADAS, Internet of Things, machine learning and big data analytics. He has authored or co-authored over 100 technical papers in major international journals and conferences. Dr. He is Editor or Guest Editor of several international journals, including Wireless Communications and Mobile Computing, IEEE Access, International Journals of Communication Systems, International Journal of Distributed Sensor Networks, and KSII Transactions on Internet and Information Systems. He acted as TPC chair for international conference ICAIT’2010 and ICONI’2010. He is a TPC member of many international conferences including IEEE Globecom and IEEE ICC. ([email protected])
Xiaoli Chu received her B.Eng. degree in electronic and information engineering from Xi'an Jiao Tong University in 2001 and her Ph.D. degree in electrical and electronic engineering from Hong Kong University of Science and Technology in 2005. From September 2005 to April 2012, she was with the Centre for Telecommunications Research at King's College London. Her research interests include modelling, analysis, and algorithm design for improving the performance and efficiency of wireless communication systems. ([email protected])
- ☆
This paper was recommended for publication by Prof. Guangtao Zhai.