GSHOT: Few-shot Generative Modeling of Labeled Graphs
- URL: http://arxiv.org/abs/2306.03480v2
- Date: Thu, 14 Dec 2023 07:21:56 GMT
- Title: GSHOT: Few-shot Generative Modeling of Labeled Graphs
- Authors: Sahil Manchanda, Shubham Gupta, Sayan Ranu, Srikanta Bedathur
- Abstract summary: We introduce the hitherto unexplored paradigm of few-shot graph generative modeling.
We develop GSHOT, a framework for few-shot labeled graph generative modeling.
GSHOT adapts to an unseen graph dataset through self-paced fine-tuning.
- Score: 44.94210194611249
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Deep graph generative modeling has attracted enormous attention in recent
years due to its impressive ability to directly learn the underlying hidden graph
distribution. Despite their initial success, these techniques, like many existing deep
generative methods, require a large number of training samples to learn a good model.
Unfortunately, a large number of training samples may not always be available in
scenarios such as drug discovery for rare diseases. At the same time, recent advances
in few-shot learning have opened the door to applications where available training data
is limited. In this work, we introduce the hitherto unexplored paradigm of few-shot
graph generative modeling. Towards this, we develop GSHOT, a meta-learning-based
framework for few-shot labeled graph generative modeling. GSHOT learns to transfer
meta-knowledge from similar auxiliary graph datasets. Utilizing these prior
experiences, GSHOT quickly adapts to an unseen graph dataset through self-paced
fine-tuning. Through extensive experiments on datasets from diverse domains with
limited training samples, we establish that GSHOT generates graphs of superior
fidelity compared to existing baselines.
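The abstract describes two stages: meta-training on auxiliary graph datasets and self-paced fine-tuning on the unseen few-shot dataset. The paper's exact algorithm is not given here, so the sketch below is a minimal, hypothetical illustration in PyTorch: it assumes a Reptile-style first-order meta-update and a simple linear pacing schedule, and `model.nll`, the optimizers, and all hyperparameters are placeholders rather than the authors' implementation.

```python
import copy
import torch

def meta_train(model, aux_datasets, meta_lr=1e-3, inner_lr=1e-2, inner_steps=5, epochs=100):
    """Reptile-style first-order meta-training over auxiliary graph datasets.

    `model.nll(graph)` is a hypothetical method returning the negative
    log-likelihood of a labeled graph under the generative model.
    """
    for _ in range(epochs):
        for dataset in aux_datasets:          # each dataset: a list of labeled graphs
            inner = copy.deepcopy(model)      # fast weights adapted to this dataset
            opt = torch.optim.SGD(inner.parameters(), lr=inner_lr)
            for graph in dataset[:inner_steps]:
                loss = inner.nll(graph)
                opt.zero_grad()
                loss.backward()
                opt.step()
            # Reptile meta-update: nudge the shared parameters toward the adapted ones.
            with torch.no_grad():
                for p, q in zip(model.parameters(), inner.parameters()):
                    p.add_(meta_lr * (q - p))
    return model

def self_paced_finetune(model, few_shot_graphs, lr=1e-3, rounds=10):
    """Self-paced fine-tuning on the unseen few-shot dataset: train first on the
    graphs the meta-trained model already fits well (low NLL) and gradually
    admit harder ones (the linear pacing schedule is an assumption)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for r in range(1, rounds + 1):
        with torch.no_grad():
            losses = [model.nll(g).item() for g in few_shot_graphs]
        keep = max(1, len(few_shot_graphs) * r // rounds)
        easiest = [g for _, g in sorted(zip(losses, few_shot_graphs), key=lambda t: t[0])][:keep]
        for g in easiest:
            loss = model.nll(g)
            opt.zero_grad()
            loss.backward()
            opt.step()
    return model
```

In this reading, one would call `meta_train` on the auxiliary datasets and then `self_paced_finetune` on the handful of target-domain graphs before sampling from the model.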
Related papers
- GraphFM: A Scalable Framework for Multi-Graph Pretraining [2.882104808886318]
We introduce a scalable multi-graph multi-task pretraining approach specifically tailored for node classification tasks across diverse graph datasets from different domains.
We demonstrate the efficacy of our approach by training a model on 152 different graph datasets comprising over 7.4 million nodes and 189 million edges.
Our results show that pretraining on a diverse array of real and synthetic graphs improves the model's adaptability and stability, while performing competitively with state-of-the-art specialist models.
arXiv Detail & Related papers (2024-07-16T16:51:43Z)
- Large Generative Graph Models [74.58859158271169]
We propose a new class of graph generative model called the Large Graph Generative Model (LGGM).
The pre-trained LGGM has superior zero-shot generative capability to existing graph generative models.
LGGM can be easily fine-tuned with graphs from target domains and demonstrates even better performance than models trained directly from scratch.
arXiv Detail & Related papers (2024-06-07T17:41:47Z)
- Generative Diffusion Models on Graphs: Methods and Applications [50.44334458963234]
Diffusion models, as a novel generative paradigm, have achieved remarkable success in various image generation tasks.
Graph generation is a crucial computational task on graphs with numerous real-world applications.
arXiv Detail & Related papers (2023-02-06T06:58:17Z)
- A Framework for Large Scale Synthetic Graph Dataset Generation [2.248608623448951]
This work proposes a scalable synthetic graph generation tool to scale the datasets to production-size graphs.
The tool learns a series of parametric models from proprietary datasets; these models can then be released to researchers.
We demonstrate the generalizability of the framework across a series of datasets.
arXiv Detail & Related papers (2022-10-04T22:41:33Z)
- Graph Generative Model for Benchmarking Graph Neural Networks [73.11514658000547]
We introduce a novel graph generative model that learns and reproduces the distribution of real-world graphs in a privacy-controlled way.
Our model can successfully generate privacy-controlled, synthetic substitutes of large-scale real-world graphs that can be effectively used to benchmark GNN models.
arXiv Detail & Related papers (2022-07-10T06:42:02Z)
- Data-Free Adversarial Knowledge Distillation for Graph Neural Networks [62.71646916191515]
We propose the first end-to-end framework for data-free adversarial knowledge distillation on graph-structured data (DFAD-GNN).
Specifically, DFAD-GNN employs a generative adversarial network with three components: a pre-trained teacher model and a student model act as two discriminators, while a generator produces training graphs used to distill knowledge from the teacher into the student.
Our DFAD-GNN significantly surpasses state-of-the-art data-free baselines in the graph classification task.
arXiv Detail & Related papers (2022-05-08T08:19:40Z)
- Generating a Doppelganger Graph: Resembling but Distinct [5.618335078130568]
We propose an approach to generating a doppelganger graph that resembles a given one in many graph properties.
The approach is an orchestration of graph representation learning, generative adversarial networks, and graph realization algorithms.
arXiv Detail & Related papers (2021-01-23T22:08:27Z)
- Neural Language Modeling for Contextualized Temporal Graph Generation [49.21890450444187]
This paper presents the first study on using large-scale pre-trained language models for automated generation of an event-level temporal graph for a document.
arXiv Detail & Related papers (2020-10-20T07:08:00Z)