Abstract:
Graph Neural Networks (GNNs) are widely used to learn node representations in complex graph structures. Traditional methods embed nodes in Euclidean space but struggle to capture the hierarchical structures common in real-world graphs. Moreover, many graphs in practical applications are dynamic, evolving continuously over time. To study the characteristics of complex temporal networks, we introduce a dynamic graph embedding model in Lorentz space, building on the previously proposed DynHAT model. More specifically, our model divides a dynamic graph into multiple discrete static snapshots, maps each snapshot into Lorentz space, and then learns informative node representations over time using a self-attention mechanism. We conduct link prediction experiments on two types of graphs: communication networks and rating networks. Comprehensive experiments on five real-world datasets demonstrate the superiority of our model for embedding dynamic graphs in Lorentz space.
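The abstract does not detail how snapshots are mapped into Lorentz space; a standard choice in hyperbolic GNN work is the exponential map at the hyperboloid origin, which lifts a Euclidean embedding onto the Lorentz model of constant curvature -1. The sketch below is illustrative only and is not taken from the paper's implementation:

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian inner product: -x0*y0 + <x_1:, y_1:>
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def exp_map_origin(v_spatial):
    # Lift a Euclidean vector (viewed as a tangent vector at the
    # hyperboloid origin (1, 0, ..., 0)) onto the Lorentz model
    # of hyperbolic space with curvature -1.
    n = np.linalg.norm(v_spatial)
    if n < 1e-12:
        return np.concatenate(([1.0], np.zeros_like(v_spatial)))
    time_coord = np.cosh(n)
    space_coords = np.sinh(n) * v_spatial / n
    return np.concatenate(([time_coord], space_coords))

# A Euclidean node embedding from one static snapshot, lifted:
x = exp_map_origin(np.array([0.3, -0.5, 0.2]))
# Points on the hyperboloid satisfy <x, x>_L = -1:
print(round(lorentz_inner(x, x), 6))  # → -1.0
```

Per-snapshot embeddings lifted this way could then be fed to a temporal self-attention layer, as the abstract describes.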
Published in: 2024 27th International Conference on Computer Supported Cooperative Work in Design (CSCWD)
Date of Conference: 08-10 May 2024
Date Added to IEEE Xplore: 10 July 2024