TIGGER: Scalable Generative Modelling for Temporal Interaction Graphs
- URL: http://arxiv.org/abs/2203.03564v2
- Date: Tue, 8 Mar 2022 12:13:07 GMT
- Title: TIGGER: Scalable Generative Modelling for Temporal Interaction Graphs
- Authors: Shubham Gupta, Sahil Manchanda, Srikanta Bedathur and Sayan Ranu
- Abstract summary: Existing generative models do not scale with the time horizon or the number of nodes.
In this paper, we bridge these gaps with a novel generative model called TIGGER.
We establish that TIGGER generates graphs of superior fidelity while being up to three orders of magnitude faster than the state of the art.
- Score: 19.71442902979904
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: There has been a recent surge in learning generative models for graphs. While
impressive progress has been made on static graphs, work on generative modeling
of temporal graphs is at a nascent stage with significant scope for
improvement. First, existing generative models do not scale with either the
time horizon or the number of nodes. Second, existing techniques are
transductive in nature and thus do not facilitate knowledge transfer. Finally,
because they rely on a one-to-one node mapping from the source to the generated
graph, existing models leak node-identity information and do not allow
up-scaling or down-scaling of the source graph. In this paper, we bridge these
gaps with a novel generative model called TIGGER. TIGGER derives its power
from a combination of temporal point processes and auto-regressive modeling,
enabling both transductive and inductive variants. Through extensive
experiments on real datasets, we establish that TIGGER generates graphs of
superior fidelity while being up to three orders of magnitude faster than the
state of the art.
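To make the abstract's core idea concrete, here is a minimal illustrative sketch of generating a temporal interaction graph by combining the two ingredients the abstract names: event timestamps drawn from a temporal point process (here a simple homogeneous Poisson process, for illustration only) and edges chosen auto-regressively, conditioned on the previous event. This is a toy sketch, not TIGGER's learned model; the function name and all parameters are hypothetical.

```python
import random

def sample_temporal_graph(num_nodes, horizon, rate=1.0, seed=0):
    """Toy temporal-graph generator (NOT TIGGER's actual model).

    Timestamps come from a homogeneous Poisson process (exponential
    inter-event gaps), and the source node of each new edge is chosen
    auto-regressively: with probability 0.5 it reuses the previously
    active node, mimicking conditioning on generation history.
    """
    rng = random.Random(seed)
    events, t = [], 0.0
    prev = rng.randrange(num_nodes)
    while True:
        t += rng.expovariate(rate)  # next event time from the point process
        if t > horizon:             # stop once the time horizon is exceeded
            break
        # auto-regressive flavour: bias the source toward the last active node
        src = prev if rng.random() < 0.5 else rng.randrange(num_nodes)
        dst = rng.randrange(num_nodes)
        while dst == src:           # disallow self-loops
            dst = rng.randrange(num_nodes)
        events.append((src, dst, round(t, 3)))
        prev = dst
    return events

# Each event is an interaction (source, destination, timestamp)
events = sample_temporal_graph(num_nodes=5, horizon=10.0)
```

A learned model such as TIGGER would replace both the constant-rate clock (with a learned conditional intensity) and the coin-flip edge choice (with a neural auto-regressive distribution over node pairs), but the generation loop has the same shape: sample the next time, then sample the interacting pair.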
Related papers
- IFH: a Diffusion Framework for Flexible Design of Graph Generative Models [53.219279193440734]
Graph generative models can be classified into two prominent families: one-shot models, which generate a graph in one go, and sequential models, which generate a graph by successive additions of nodes and edges.
This paper proposes a graph generative model, called Insert-Fill-Halt (IFH), that supports the specification of a sequentiality degree.
arXiv Detail & Related papers (2024-08-23T16:24:40Z)
- Deep Prompt Tuning for Graph Transformers [55.2480439325792]
Fine-tuning is resource-intensive and requires storing multiple copies of large models.
We propose a novel approach called deep graph prompt tuning as an alternative to fine-tuning.
By freezing the pre-trained parameters and only updating the added tokens, our approach reduces the number of free parameters and eliminates the need for multiple model copies.
arXiv Detail & Related papers (2023-09-18T20:12:17Z)
- Graph Relation Aware Continual Learning [3.908470250825618]
Continual graph learning (CGL) studies the problem of learning from an infinite stream of graph data.
We design a relation-aware adaptive model, dubbed RAM-CG, that consists of a relation-discovery module to explore latent relations behind edges.
RAM-CG yields accuracy improvements of 2.2%, 6.9% and 6.6% over the state-of-the-art results on the CitationNet, OGBN-arxiv and TWITCH datasets.
arXiv Detail & Related papers (2023-08-16T09:53:20Z)
- Using Motif Transitions for Temporal Graph Generation [0.0]
We develop a practical temporal graph generator to generate synthetic temporal networks with realistic global and local features.
Our key idea is modeling the arrival of new events as temporal motif transition processes.
We demonstrate that our model consistently outperforms the baselines with respect to preserving various global and local temporal graph statistics and runtime performance.
arXiv Detail & Related papers (2023-06-19T22:53:42Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Generative Diffusion Models on Graphs: Methods and Applications [50.44334458963234]
Diffusion models, as a novel generative paradigm, have achieved remarkable success in various image generation tasks.
Graph generation is a crucial computational task on graphs with numerous real-world applications.
arXiv Detail & Related papers (2023-02-06T06:58:17Z)
- FLOWGEN: Fast and slow graph generation [49.21890450444187]
We present FLOWGEN, a graph-generation model inspired by the dual-process theory of mind.
Depending on the difficulty of completing the graph at the current step, generation is routed to either a fast (weaker) or a slow (stronger) model.
Experiments on real-world graphs show that FLOWGEN can generate graphs similar to those produced by a single large model in a fraction of the time.
arXiv Detail & Related papers (2022-07-15T16:32:23Z)
- Dynamic Graph Learning-Neural Network for Multivariate Time Series Modeling [2.3022070933226217]
We propose a novel framework, namely the static- and dynamic-graph learning neural network (GL).
The model acquires static and dynamic graph matrices from data to model long-term and short-term patterns respectively.
It achieves state-of-the-art performance on almost all datasets.
arXiv Detail & Related papers (2021-12-06T08:19:15Z)
- TCL: Transformer-based Dynamic Graph Modelling via Contrastive Learning [87.38675639186405]
We propose a novel graph neural network approach, called TCL, which deals with the dynamically-evolving graph in a continuous-time fashion.
To the best of our knowledge, this is the first attempt to apply contrastive learning to representation learning on dynamic graphs.
arXiv Detail & Related papers (2021-05-17T15:33:25Z)
- TG-GAN: Continuous-time Temporal Graph Generation with Deep Generative Models [9.75258136573147]
We propose a new model, called Temporal Graph Generative Adversarial Network (TG-GAN), for continuous-time temporal graph generation.
We first propose a novel temporal graph generator that jointly models truncated edge sequences, time budgets, and node attributes.
In addition, a new temporal graph discriminator is proposed, which combines time and node encoding operations over a recurrent architecture to distinguish generated sequences from real ones.
arXiv Detail & Related papers (2020-05-17T17:59:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.