Let There Be Order: Rethinking Ordering in Autoregressive Graph
Generation
- URL: http://arxiv.org/abs/2305.15562v1
- Date: Wed, 24 May 2023 20:52:34 GMT
- Title: Let There Be Order: Rethinking Ordering in Autoregressive Graph
Generation
- Authors: Jie Bu, Kazi Sajeed Mehrab, Anuj Karpatne
- Abstract summary: Conditional graph generation tasks involve training a model to generate a graph given a set of input conditions.
Many previous studies employ autoregressive models to incrementally generate graph components such as nodes and edges.
As graphs typically lack a natural ordering among their components, converting a graph into a sequence of tokens is not straightforward.
- Score: 6.422073551199993
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Conditional graph generation tasks involve training a model to generate a
graph given a set of input conditions. Many previous studies employ
autoregressive models to incrementally generate graph components such as nodes
and edges. However, as graphs typically lack a natural ordering among their
components, converting a graph into a sequence of tokens is not
straightforward. While prior works mostly rely on conventional heuristics or
graph traversal methods like breadth-first search (BFS) or depth-first search
(DFS) to convert graphs to sequences, the impact of ordering on graph
generation has largely been unexplored. This paper contributes to this problem
by: (1) highlighting the crucial role of ordering in autoregressive graph
generation models, (2) proposing a novel theoretical framework that perceives
ordering as a dimensionality reduction problem, thereby facilitating a deeper
understanding of the relationship between orderings and generated graph
accuracy, and (3) introducing "latent sort," a learning-based ordering scheme
to perform dimensionality reduction of graph tokens. Our experimental results
showcase the effectiveness of latent sort across a wide range of graph
generation tasks, encouraging future works to further explore and develop
learning-based ordering schemes for autoregressive graph generation.
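The abstract contrasts conventional traversal orderings (BFS/DFS) with the proposed learning-based scheme. The paper's actual latent sort procedure is not detailed in this summary; the sketch below shows the conventional BFS heuristic next to a toy "sort nodes by a 1-D projection of their features" scheme, where the fixed linear projection merely stands in for the learned dimensionality-reduction mapping (all names and the projection weights are illustrative).

```python
from collections import deque

def bfs_order(adj, start):
    """Return a node ordering from breadth-first search.

    adj: dict mapping node -> list of neighbours. BFS is one of the
    conventional heuristics for turning a graph into a token sequence.
    """
    order, seen, queue = [], {start}, deque([start])
    while queue:
        u = queue.popleft()
        order.append(u)
        for v in adj[u]:
            if v not in seen:
                seen.add(v)
                queue.append(v)
    return order

def latent_sort(node_features, projection):
    """Order nodes by a 1-D projection of their feature vectors.

    Illustrative sketch of "ordering as dimensionality reduction":
    each node is mapped to a single scalar (here via a fixed linear
    projection; the paper learns this mapping) and nodes are sorted
    by that scalar.
    """
    scores = {n: sum(f * w for f, w in zip(feats, projection))
              for n, feats in node_features.items()}
    return sorted(scores, key=scores.get)

# Toy graph: a path 0-1-2-3 with a branch 1-4.
adj = {0: [1], 1: [0, 2, 4], 2: [1, 3], 3: [2], 4: [1]}
print(bfs_order(adj, 0))  # [0, 1, 2, 4, 3]

features = {0: [0.1, 0.0], 1: [0.9, 0.2], 2: [0.4, 0.1],
            3: [0.2, 0.3], 4: [0.7, 0.0]}
print(latent_sort(features, projection=[1.0, 0.5]))  # [0, 3, 2, 4, 1]
```

The two functions produce different node sequences for the same graph, which is exactly the degree of freedom the paper argues matters for autoregressive generation quality.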
Related papers
- Neural Graph Generator: Feature-Conditioned Graph Generation using Latent Diffusion Models [22.794561387716502]
We introduce the Neural Graph Generator (NGG), a novel approach which utilizes conditioned latent diffusion models for graph generation.
NGG demonstrates a remarkable capacity to model complex graph patterns, offering control over the graph generation process.
arXiv Detail & Related papers (2024-03-03T15:28:47Z)
- Overcoming Order in Autoregressive Graph Generation [12.351817671944515]
Graph generation is a fundamental problem in various domains, including chemistry and social networks.
Recent work has shown that molecular graph generation using recurrent neural networks (RNNs) is advantageous compared to traditional generative approaches.
arXiv Detail & Related papers (2024-02-04T09:58:22Z)
- Deep Prompt Tuning for Graph Transformers [55.2480439325792]
Fine-tuning is resource-intensive and requires storing multiple copies of large models.
We propose a novel approach called deep graph prompt tuning as an alternative to fine-tuning.
By freezing the pre-trained parameters and only updating the added tokens, our approach reduces the number of free parameters and eliminates the need for multiple model copies.
arXiv Detail & Related papers (2023-09-18T20:12:17Z)
- Graph Generation with Diffusion Mixture [57.78958552860948]
Generation of graphs is a major challenge for real-world tasks that require understanding the complex nature of their non-Euclidean structures.
We propose a generative framework that models the topology of graphs by explicitly learning the final graph structures of the diffusion process.
arXiv Detail & Related papers (2023-02-07T17:07:46Z)
- Spectral Augmentations for Graph Contrastive Learning [50.149996923976836]
Contrastive learning has emerged as a premier method for learning representations with or without supervision.
Recent studies have shown its utility in graph representation learning for pre-training.
We propose a set of well-motivated graph transformation operations to provide a bank of candidates when constructing augmentations for a graph contrastive objective.
arXiv Detail & Related papers (2023-02-06T16:26:29Z)
- Explanation Graph Generation via Pre-trained Language Models: An Empirical Study with Contrastive Learning [84.35102534158621]
We study pre-trained language models that generate explanation graphs in an end-to-end manner.
We propose simple yet effective ways of graph perturbations via node and edge edit operations.
Our methods lead to significant improvements in both structural and semantic accuracy of explanation graphs.
arXiv Detail & Related papers (2022-04-11T00:58:27Z)
- Order Matters: Probabilistic Modeling of Node Sequence for Graph Generation [18.03898476141173]
A graph generative model defines a distribution over graphs.
We derive the exact joint probability over the graph and the node ordering of the sequential process.
We train graph generative models by maximizing this bound, without using the ad-hoc node orderings of previous methods.
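The summary's "exact joint probability" and training bound can be made concrete (notation is assumed, not taken from the paper): a sequential model defines a joint distribution over a graph $G$ and a node ordering $\pi$, so the graph likelihood marginalizes over orderings, and training maximizes a variational lower bound with an ordering posterior $q(\pi \mid G)$, since summing over all $n!$ orderings is intractable.

```latex
p(G) \;=\; \sum_{\pi} p(G, \pi),
\qquad
\log p(G) \;\ge\; \mathbb{E}_{q(\pi \mid G)}\!\left[
    \log \frac{p(G, \pi)}{q(\pi \mid G)}
\right]
```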
arXiv Detail & Related papers (2021-06-11T06:37:52Z)
- Graph2Graph Learning with Conditional Autoregressive Models [8.203106789678397]
We present a conditional autoregressive model for graph-to-graph learning.
We illustrate its representational capabilities via experiments on challenging subgraph predictions from graph algorithmics.
arXiv Detail & Related papers (2021-06-06T20:28:07Z)
- Line Graph Neural Networks for Link Prediction [71.00689542259052]
We consider the graph link prediction task, which is a classic graph analytical problem with many real-world applications.
In this formalism, a link prediction problem is converted to a graph classification task.
We propose to seek a radically different and novel path by making use of the line graphs in graph theory.
In particular, each node in a line graph corresponds to a unique edge in the original graph. Therefore, link prediction problems in the original graph can be equivalently solved as a node classification problem in its corresponding line graph, instead of a graph classification task.
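The edge-to-node correspondence described here can be shown with a minimal, dependency-free line-graph construction (in practice one would use `networkx.line_graph` and run a node classifier such as a GNN on the result; the function below is an illustrative sketch, not the paper's implementation).

```python
from itertools import combinations

def line_graph(edges):
    """Build the line graph L(G): one node per edge of G; two
    line-graph nodes are adjacent iff the corresponding edges of G
    share an endpoint."""
    edges = [tuple(sorted(e)) for e in edges]
    lg_adj = {e: set() for e in edges}
    for e1, e2 in combinations(edges, 2):
        if set(e1) & set(e2):  # edges share an endpoint
            lg_adj[e1].add(e2)
            lg_adj[e2].add(e1)
    return lg_adj

# Triangle 0-1-2 plus a pendant edge 2-3: predicting the link (2, 3)
# in G becomes classifying the node (2, 3) in L(G).
G_edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
L = line_graph(G_edges)
print(sorted(L[(2, 3)]))  # [(0, 2), (1, 2)]
```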
arXiv Detail & Related papers (2020-10-20T05:54:31Z)
- Contrastive Self-supervised Learning for Graph Classification [21.207647143672585]
We propose two approaches based on contrastive self-supervised learning (CSSL) to alleviate overfitting.
In the first approach, we use CSSL to pretrain graph encoders on widely-available unlabeled graphs without relying on human-provided labels.
In the second approach, we develop a regularizer based on CSSL, and solve the supervised classification task and the unsupervised CSSL task simultaneously.
arXiv Detail & Related papers (2020-09-13T05:12:55Z)
- Second-Order Pooling for Graph Neural Networks [62.13156203025818]
We propose to use second-order pooling as graph pooling, which naturally solves the above challenges.
We show that direct use of second-order pooling with graph neural networks leads to practical problems.
We propose two novel global graph pooling methods based on second-order pooling; namely, bilinear mapping and attentional second-order pooling.
arXiv Detail & Related papers (2020-07-20T20:52:36Z)
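Second-order pooling, as named in the entry above, reduces a set of node features $H \in \mathbb{R}^{n \times d}$ to the flattened matrix $H^{\top} H$; the bilinear and attentional variants the paper proposes add learned projections before this step. A minimal dependency-free sketch of the plain version (an illustration of the standard operation, not the paper's code):

```python
def second_order_pool(H):
    """Second-order pooling: given node features H (n x d lists),
    return the flattened d x d matrix H^T H. The result is invariant
    to node permutation and captures feature co-occurrence across the
    whole graph."""
    n, d = len(H), len(H[0])
    pooled = [[sum(H[k][i] * H[k][j] for k in range(n))
               for j in range(d)] for i in range(d)]
    return [x for row in pooled for x in row]  # flatten to length d*d

# Two node-permuted copies of the same graph pool to the same vector.
H1 = [[1.0, 2.0], [3.0, 0.0], [0.0, 1.0]]
H2 = [H1[2], H1[0], H1[1]]  # rows (nodes) permuted
print(second_order_pool(H1) == second_order_pool(H2))  # True
```

Permutation invariance is what makes this usable as a graph-level readout: the pooled vector does not depend on any node ordering.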
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.