DURENDAL: Graph deep learning framework for temporal heterogeneous
networks
- URL: http://arxiv.org/abs/2310.00336v1
- Date: Sat, 30 Sep 2023 10:46:01 GMT
- Title: DURENDAL: Graph deep learning framework for temporal heterogeneous
networks
- Authors: Manuel Dileo, Matteo Zignani and Sabrina Gaito
- Abstract summary: Temporal heterogeneous networks (THNs) are evolving networks that characterize many real-world applications.
We propose DURENDAL, a graph deep learning framework for THNs.
- Score: 0.5156484100374057
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Temporal heterogeneous networks (THNs) are evolving networks that
characterize many real-world applications such as citation and events networks,
recommender systems, and knowledge graphs. Although different Graph Neural
Networks (GNNs) have been successfully applied to dynamic graphs, most of them
only support homogeneous graphs or suffer from model design heavily influenced
by specific THN prediction tasks. Furthermore, there is a lack of temporal
heterogeneous networked data in current standard graph benchmark datasets.
Hence, in this work, we propose DURENDAL, a graph deep learning framework for
THNs. DURENDAL makes it easy to repurpose any heterogeneous graph learning
model for evolving networks by combining design principles from snapshot-based
and multirelational message-passing graph learning models. We introduce two
different schemes to update embedding representations for THNs, discussing the
strengths and weaknesses of both strategies. We also extend the set of
benchmarks for THNs by introducing two novel high-resolution temporal
heterogeneous graph datasets derived from an emerging Web3 platform and a
well-established e-commerce website. Overall, we conducted the experimental
evaluation of the framework over four temporal heterogeneous network datasets
on future link prediction tasks in an evaluation setting that takes into
account the evolving nature of the data. Experiments show the prediction power
of DURENDAL compared to current solutions for evolving and dynamic graphs, and
the effectiveness of its model design.
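The two embedding-update schemes the abstract mentions can be illustrated with a minimal, dependency-free sketch. This is not the authors' implementation: the scheme names ("update-then-aggregate" and "aggregate-then-update"), the mean-over-neighbours message passing, and the exponential-moving-average temporal update are all simplifying assumptions chosen for illustration.

```python
# Toy sketch of two embedding-update schemes for a temporal heterogeneous
# network (THN), assumed here to be an "update-then-aggregate" (UTA) and an
# "aggregate-then-update" (ATU) variant. Embeddings are plain float lists;
# per-relation message passing is a mean over incoming neighbours, and the
# temporal update is an exponential moving average (EMA).
from collections import defaultdict

def mean(vectors, dim):
    if not vectors:
        return [0.0] * dim
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def ema(old, new, alpha=0.5):
    # Temporal update: blend the previous embedding with the new message.
    return [alpha * n + (1 - alpha) * o for o, n in zip(old, new)]

def relation_messages(snapshot, emb, dim):
    """Per-relation mean aggregation: rel -> {node: aggregated message}."""
    msgs = {}
    for rel, edges in snapshot.items():
        neigh = defaultdict(list)
        for src, dst in edges:
            neigh[dst].append(emb[src])
        msgs[rel] = {n: mean(vs, dim) for n, vs in neigh.items()}
    return msgs

def step_uta(snapshot, emb, dim):
    """Update-then-aggregate: temporally update each per-relation view of a
    node, then combine the views into a single embedding."""
    msgs = relation_messages(snapshot, emb, dim)
    out = {}
    for node, h in emb.items():
        views = [ema(h, m[node]) for m in msgs.values() if node in m]
        out[node] = mean(views, dim) if views else h
    return out

def step_atu(snapshot, emb, dim):
    """Aggregate-then-update: fuse per-relation messages first, then apply
    one temporal update to the fused message."""
    msgs = relation_messages(snapshot, emb, dim)
    out = {}
    for node, h in emb.items():
        fused = [m[node] for m in msgs.values() if node in m]
        out[node] = ema(h, mean(fused, dim)) if fused else h
    return out

# Two snapshots of a tiny THN with relations "cites" and "buys".
snapshots = [
    {"cites": [("a", "b")], "buys": [("c", "b")]},
    {"cites": [("a", "b"), ("b", "c")], "buys": []},
]
emb = {"a": [1.0, 0.0], "b": [0.0, 1.0], "c": [1.0, 1.0]}
for snap in snapshots:
    emb = step_atu(snap, emb, dim=2)
print(emb["b"])
```

The two schemes differ only in whether the temporal update is applied per relation (UTA) or once after fusing all relations (ATU), which is the design trade-off the abstract says the paper discusses.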
Related papers
- SIG: Efficient Self-Interpretable Graph Neural Network for Continuous-time Dynamic Graphs [34.269958289295516]
We aim to predict future links within the dynamic graph while simultaneously providing causal explanations for these predictions.
To tackle these challenges, we propose a novel causal inference model, namely the Independent and Confounded Causal Model (ICCM).
Our proposed model significantly outperforms existing methods across link prediction accuracy, explanation quality, and robustness to shortcut features.
arXiv Detail & Related papers (2024-05-29T13:09:33Z) - Graph Transformer GANs with Graph Masked Modeling for Architectural
Layout Generation [153.92387500677023]
We present a novel graph Transformer generative adversarial network (GTGAN) to learn effective graph node relations.
The proposed graph Transformer encoder combines graph convolutions and self-attentions in a Transformer to model both local and global interactions.
We also propose a novel self-guided pre-training method for graph representation learning.
arXiv Detail & Related papers (2024-01-15T14:36:38Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Analysis of different temporal graph neural network configurations on
dynamic graphs [0.0]
This project aims to address the gap in the literature by performing a qualitative analysis of spatial-temporal dependence structure learning on dynamic graphs.
An extensive ablation study will be conducted on different variants of the best-performing TGN to identify the key factors contributing to its performance.
By achieving these objectives, this project will provide valuable insights into the design and optimization of TGNs for dynamic graph analysis.
arXiv Detail & Related papers (2023-05-02T00:07:33Z) - Structure-reinforced Transformer for Dynamic Graph Representation Learning with Edge Temporal States [8.577434144370004]
We introduce a novel dynamic graph representation learning framework, namely the Recurrent Structure-reinforced Graph Transformer (RSGT).
RSGT initially models the temporal status of edges explicitly by utilizing different edge types and weights based on the differences between any two consecutive snapshots.
A structure-reinforced graph transformer is proposed to capture temporal node representations that encode both the graph topological structure and evolving dynamics.
arXiv Detail & Related papers (2023-04-20T04:12:50Z) - PGCN: Progressive Graph Convolutional Networks for Spatial-Temporal Traffic Forecasting [4.14360329494344]
We propose a novel traffic forecasting framework called Progressive Graph Convolutional Network (PGCN)
PGCN constructs a set of graphs by progressively adapting to online input data during the training and testing phases.
The proposed model achieves state-of-the-art performance with consistency in all datasets.
arXiv Detail & Related papers (2022-02-18T02:15:44Z) - Handling Distribution Shifts on Graphs: An Invariance Perspective [77.14319095965058]
We formulate the OOD problem for node-level prediction on graphs.
We develop a new domain-invariant learning approach, named Explore-to-Extrapolate Risk Minimization.
We prove the validity of our method by theoretically showing its guarantee of a valid OOD solution.
arXiv Detail & Related papers (2022-02-05T02:31:01Z) - Anomaly Detection in Dynamic Graphs via Transformer [30.926884264054042]
We present a novel Transformer-based Anomaly Detection framework for DYnamic graph (TADDY)
Our framework constructs a comprehensive node encoding strategy to better represent each node's structural and temporal roles in an evolving graph stream.
Our proposed TADDY framework outperforms the state-of-the-art methods by a large margin on four real-world datasets.
arXiv Detail & Related papers (2021-06-18T02:27:19Z) - TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z) - GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
This list is automatically generated from the titles and abstracts of the papers on this site.