Top-N: Equivariant set and graph generation without exchangeability
- URL: http://arxiv.org/abs/2110.02096v1
- Date: Tue, 5 Oct 2021 14:51:19 GMT
- Title: Top-N: Equivariant set and graph generation without exchangeability
- Authors: Clement Vignac and Pascal Frossard
- Abstract summary: We consider one-shot probabilistic decoders that map a vector-shaped prior to a distribution over sets or graphs.
These functions can be integrated into variational autoencoders (VAE), generative adversarial networks (GAN) or normalizing flows.
Top-n is a deterministic, non-exchangeable set creation mechanism which learns to select the most relevant points from a trainable reference set.
- Score: 61.24699600833916
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider one-shot probabilistic decoders that map a vector-shaped prior to
a distribution over sets or graphs. These functions can be integrated into
variational autoencoders (VAE), generative adversarial networks (GAN) or
normalizing flows, and have important applications in drug discovery. Set and
graph generation is most commonly performed by generating points (and sometimes
edge weights) i.i.d. from a normal distribution, and processing them along with
the prior vector using Transformer layers or graph neural networks. This
architecture is designed to generate exchangeable distributions (all
permutations of a set are equally likely) but it is hard to train due to the
stochasticity of i.i.d. generation. We propose a new definition of equivariance
and show that exchangeability is in fact unnecessary in VAEs and GANs. We then
introduce Top-n, a deterministic, non-exchangeable set creation mechanism which
learns to select the most relevant points from a trainable reference set. Top-n
can replace i.i.d. generation in any VAE or GAN -- it is easier to train and
better captures complex dependencies in the data. Top-n outperforms i.i.d.
generation by 15% at SetMNIST reconstruction, generates sets that are 64%
closer to the true distribution on a synthetic molecule-like dataset, and is
able to generate more diverse molecules when trained on the classical QM9
dataset. With improved foundations in one-shot generation, our algorithm
contributes to the design of more effective molecule generation methods.
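Since the abstract only sketches the Top-n mechanism, a minimal illustration may help. The sketch below, assuming PyTorch, shows a decoder that keeps a trainable reference set of candidate points and deterministically selects the n most relevant ones conditioned on the latent vector, instead of sampling points i.i.d. from a normal distribution. The module structure, the bilinear relevance score, and the refinement MLP are illustrative assumptions based only on the abstract, not the authors' implementation.

```python
# Minimal sketch of a Top-n style set decoder (illustrative, not the paper's code).
import torch
import torch.nn as nn


class TopNSetDecoder(nn.Module):
    def __init__(self, latent_dim: int, point_dim: int, reference_size: int = 128):
        super().__init__()
        # Trainable reference set: a pool of candidate points.
        self.reference = nn.Parameter(torch.randn(reference_size, point_dim))
        # Maps the latent vector to a query used to score reference points
        # (a simple bilinear relevance score; an assumption for this sketch).
        self.to_query = nn.Linear(latent_dim, point_dim)
        # Refines the selected points conditioned on the latent vector.
        self.refine = nn.Sequential(
            nn.Linear(point_dim + latent_dim, 64),
            nn.ReLU(),
            nn.Linear(64, point_dim),
        )

    def forward(self, z: torch.Tensor, n: int) -> torch.Tensor:
        # z: (batch, latent_dim) latent / prior vector; n: output set size.
        query = self.to_query(z)                      # (batch, point_dim)
        scores = query @ self.reference.t()           # (batch, reference_size)
        # Deterministic, non-exchangeable selection of the n highest-scoring
        # reference points (no i.i.d. sampling involved).
        top = scores.topk(n, dim=-1).indices          # (batch, n)
        points = self.reference[top]                  # (batch, n, point_dim)
        z_rep = z.unsqueeze(1).expand(-1, n, -1)      # (batch, n, latent_dim)
        return self.refine(torch.cat([points, z_rep], dim=-1))


if __name__ == "__main__":
    decoder = TopNSetDecoder(latent_dim=16, point_dim=2)
    z = torch.randn(4, 16)
    print(decoder(z, n=10).shape)  # torch.Size([4, 10, 2])
```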
Related papers
- Computing Systemic Risk Measures with Graph Neural Networks [1.6874375111244329]
This paper investigates systemic risk measures for financial networks of explicitly modelled bilateral liabilities.
We study numerical methods for the approximation of systemic risk and optimal random allocations.
arXiv Detail & Related papers (2024-09-30T10:18:13Z)
- Improving Out-of-Distribution Robustness of Classifiers via Generative Interpolation [56.620403243640396]
Deep neural networks achieve superior performance for learning from independent and identically distributed (i.i.d.) data.
However, their performance deteriorates significantly when handling out-of-distribution (OoD) data.
We develop a simple yet effective method called Generative Interpolation to fuse generative models trained from multiple domains for synthesizing diverse OoD samples.
arXiv Detail & Related papers (2023-07-23T03:53:53Z)
- SwinGNN: Rethinking Permutation Invariance in Diffusion Models for Graph Generation [15.977241867213516]
Diffusion models based on permutation-equivariant networks can learn permutation-invariant distributions for graph data.
We propose a non-invariant diffusion model, called $\textit{SwinGNN}$, which employs an efficient edge-to-edge 2-WL message passing network.
arXiv Detail & Related papers (2023-07-04T10:58:42Z)
- Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods to allow an application to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z)
- Beyond permutation equivariance in graph networks [1.713291434132985]
We introduce a novel architecture for graph networks which is equivariant to the Euclidean group in $n$-dimensions.
Our model is designed to work with graph networks in their most general form, thus including particular variants as special cases.
arXiv Detail & Related papers (2021-03-25T18:36:09Z)
- Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are an emerging class of machine learning models for graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
- Building powerful and equivariant graph neural networks with structural message-passing [74.93169425144755]
We propose a powerful and equivariant message-passing framework based on two ideas.
First, we propagate a one-hot encoding of the nodes, in addition to the features, in order to learn a local context matrix around each node.
Second, we propose methods for the parametrization of the message and update functions that ensure permutation equivariance.
arXiv Detail & Related papers (2020-06-26T17:15:16Z)
- Permutation Invariant Graph Generation via Score-Based Generative Modeling [114.12935776726606]
We propose a permutation invariant approach to modeling graphs, using the recent framework of score-based generative modeling.
In particular, we design a permutation equivariant, multi-channel graph neural network to model the gradient of the data distribution at the input graph.
For graph generation, we find that our learning approach achieves better or comparable results to existing models on benchmark datasets.
arXiv Detail & Related papers (2020-03-02T03:06:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.