iLoRE: Dynamic Graph Representation with Instant Long-term Modeling and
Re-occurrence Preservation
- URL: http://arxiv.org/abs/2309.02012v1
- Date: Tue, 5 Sep 2023 07:48:52 GMT
- Title: iLoRE: Dynamic Graph Representation with Instant Long-term Modeling and
Re-occurrence Preservation
- Authors: Siwei Zhang, Yun Xiong, Yao Zhang, Xixi Wu, Yiheng Sun and Jiawei
Zhang
- Abstract summary: We present iLoRE, a novel dynamic graph modeling method with instant node-wise Long-term modeling and Re-occurrence preservation.
Our experimental results on real-world datasets demonstrate the effectiveness of our iLoRE for dynamic graph modeling.
- Score: 21.15310868951046
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Continuous-time dynamic graph modeling is a crucial task for many real-world
applications, such as financial risk management and fraud detection. Although
existing dynamic graph modeling methods achieve satisfactory results, they
still suffer from three key limitations that hinder their scalability and
broader applicability. i) Indiscriminate updating. Existing methods process
every incoming edge indiscriminately, which increases time consumption and
admits noisy information. ii) Ineffective node-wise long-term modeling. They
rely heavily on recurrent neural networks (RNNs) as a backbone, which have
been shown to be incapable of fully capturing node-wise long-term dependencies
in event sequences. iii) Neglect of re-occurrence patterns. Dynamic graphs
involve repeatedly occurring neighbors whose recurrence signals their
importance, yet existing methods neglect this signal. In this paper, we
present iLoRE, a novel dynamic graph modeling method with instant node-wise
Long-term modeling and Re-occurrence preservation. To overcome the
indiscriminate-updating issue, we introduce the Adaptive Short-term Updater
module, which automatically discards useless or noisy edges, ensuring iLoRE's
effectiveness and instant updating ability. We further propose the Long-term
Updater for more effective node-wise long-term modeling, in which a novel
Identity Attention mechanism empowers a Transformer-based updater and bypasses
the limited effectiveness of typical RNN-dominated designs. Finally, the
crucial re-occurrence patterns are encoded into a graph module for informative
representation learning, further improving the expressiveness of our method.
Our experimental results on real-world datasets demonstrate the effectiveness
of iLoRE for dynamic graph modeling.
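The two updating ideas in the abstract can be loosely illustrated in code. The sketch below is a minimal numpy illustration, not the paper's implementation: it assumes a sigmoid gate as the adaptive short-term filter that decides whether an incoming edge updates a node's state, and uses ordinary scaled dot-product attention over a node's event history as a stand-in for the paper's Identity Attention, whose exact form is not given here. The names `edge_gate`, `identity_attention`, and `W_g` are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def edge_gate(node_state, edge_feat, W_g, threshold=0.5):
    """Hypothetical adaptive short-term gate: score an incoming edge with a
    sigmoid over the concatenated node state and edge features, and keep the
    update only when the score clears the threshold (otherwise treat the
    edge as noise and discard it)."""
    logit = np.concatenate([node_state, edge_feat]) @ W_g
    score = 1.0 / (1.0 + np.exp(-logit))
    return score >= threshold, float(score)

def identity_attention(history, query):
    """Toy stand-in for Identity Attention: scaled dot-product attention over
    a node's event history, with the node's own current state appended so it
    can always attend to itself."""
    keys = np.vstack([history, query])            # (T+1, d)
    logits = keys @ query / np.sqrt(query.size)   # (T+1,)
    weights = np.exp(logits - logits.max())
    weights /= weights.sum()                      # softmax over history + self
    return weights @ keys                         # (d,) updated long-term state
```

A kept edge would feed the short-term update, while `identity_attention` aggregates the retained event history into a node-wise long-term representation; the gate is what gives the method its "instant" character, since discarded edges cost no further computation.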
Related papers
- SelfGNN: Self-Supervised Graph Neural Networks for Sequential Recommendation [15.977789295203976]
We propose a novel framework called Self-Supervised Graph Neural Network (SelfGNN) for sequential recommendation.
The SelfGNN framework encodes short-term graphs based on time intervals and utilizes Graph Neural Networks (GNNs) to learn short-term collaborative relationships.
Our personalized self-augmented learning structure enhances model robustness by mitigating noise in short-term graphs based on long-term user interests and personal stability.
arXiv Detail & Related papers (2024-05-31T14:53:12Z) - Gradient Transformation: Towards Efficient and Model-Agnostic Unlearning for Dynamic Graph Neural Networks [66.70786325911124]
Graph unlearning has emerged as an essential tool for safeguarding user privacy and mitigating the negative impacts of undesirable data.
With the increasing prevalence of DGNNs, it becomes imperative to investigate the implementation of dynamic graph unlearning.
We propose an effective, efficient, model-agnostic, and post-processing method to implement DGNN unlearning.
arXiv Detail & Related papers (2024-05-23T10:26:18Z) - Signed Graph Neural Ordinary Differential Equation for Modeling
Continuous-time Dynamics [13.912268915939656]
The prevailing approach of integrating graph neural networks with ordinary differential equations has demonstrated promising performance.
We introduce a novel approach: a signed graph neural ordinary differential equation that addresses the limitation of existing approaches in capturing signed information.
Our proposed solution boasts both flexibility and efficiency.
arXiv Detail & Related papers (2023-12-18T13:45:33Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Instant Graph Neural Networks for Dynamic Graphs [18.916632816065935]
We propose Instant Graph Neural Network (InstantGNN), an incremental approach for the graph representation matrix of dynamic graphs.
Our method avoids time-consuming, repetitive computations and allows instant updates on the representation and instant predictions.
Our model achieves state-of-the-art accuracy while having orders-of-magnitude higher efficiency than existing methods.
arXiv Detail & Related papers (2022-06-03T03:27:42Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by the limited ability of finite-depth GNNs to capture long-range dependencies, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE)
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z) - Efficient Dynamic Graph Representation Learning at Scale [66.62859857734104]
We propose Efficient Dynamic Graph lEarning (EDGE), which selectively expresses certain temporal dependency via training loss to improve the parallelism in computations.
We show that EDGE can scale to dynamic graphs with millions of nodes and hundreds of millions of temporal events and achieve new state-of-the-art (SOTA) performance.
arXiv Detail & Related papers (2021-12-14T22:24:53Z) - Causal Incremental Graph Convolution for Recommender System Retraining [89.25922726558875]
Real-world recommender systems need to be regularly retrained to keep up with new data.
In this work, we consider how to efficiently retrain graph convolution network (GCN) based recommender models.
arXiv Detail & Related papers (2021-08-16T04:20:09Z) - TG-GAN: Continuous-time Temporal Graph Generation with Deep Generative
Models [9.75258136573147]
We propose a new model, called Temporal Graph Generative Adversarial Network (TG-GAN), for continuous-time temporal graph generation.
We first propose a novel temporal graph generator that jointly models truncated edge sequences, time budgets, and node attributes.
In addition, a new temporal graph discriminator is proposed, which combines time and node encoding operations over a recurrent architecture to distinguish the generated sequences.
arXiv Detail & Related papers (2020-05-17T17:59:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.