
Dynamic Graph Embedding via Self-Attention in the Lorentz Space


Abstract:

Graph Neural Networks (GNNs) are popular for learning node representations in complex graph structures. Traditional methods embed nodes in Euclidean space but struggle to capture the hierarchical structures common in real-world graphs. Moreover, many graphs in practical applications are dynamic, evolving continuously over time. To investigate the characteristics of complex temporal networks, we introduce a dynamic graph embedding model in the Lorentz space, building upon the previously proposed DynHAT model. More specifically, our model divides the dynamic graph into multiple discrete static snapshots, maps each snapshot to the Lorentz space, and then learns informative node representations over time using a self-attention mechanism. We conduct link prediction experiments on two types of graphs: communication networks and rating networks. Through comprehensive experiments on five real-world datasets, we demonstrate the superiority of our model in embedding dynamic graphs within the Lorentz space.
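The snapshot-to-Lorentz-space mapping mentioned above is commonly realized via the exponential map at the hyperboloid origin. The sketch below is a minimal illustration of that standard construction for curvature -1, not the paper's actual implementation; the function names and the choice of base point are assumptions.

```python
import numpy as np

def lorentz_exp_map_origin(x, eps=1e-9):
    """Map a Euclidean vector x (viewed as a tangent vector at the
    hyperboloid origin) onto the Lorentz model
    H^n = {p : <p, p>_L = -1, p_0 > 0}, curvature -1.
    Illustrative helper, not the paper's implementation."""
    norm = np.linalg.norm(x)
    if norm < eps:
        # Degenerate case: return the hyperboloid origin (1, 0, ..., 0)
        return np.concatenate(([1.0], np.zeros_like(x)))
    # exp_o(x) = (cosh(||x||), sinh(||x||) * x / ||x||)
    return np.concatenate(([np.cosh(norm)], np.sinh(norm) * x / norm))

def minkowski_inner(p, q):
    """Lorentzian (Minkowski) inner product: -p_0 q_0 + sum_i p_i q_i."""
    return -p[0] * q[0] + np.dot(p[1:], q[1:])

# Example: lift a 3-dimensional Euclidean node feature onto the hyperboloid.
p = lorentz_exp_map_origin(np.array([0.5, -0.2, 0.1]))
# Every image point must satisfy the hyperboloid constraint <p, p>_L = -1.
```

In a discrete-snapshot pipeline like the one described, each static graph's Euclidean node features would be lifted this way before hyperbolic aggregation, and the self-attention over time steps then operates on the resulting Lorentz-space representations.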
Date of Conference: 08-10 May 2024
Date Added to IEEE Xplore: 10 July 2024
Conference Location: Tianjin, China

