Geometric Generative Models based on Morphological Equivariant PDEs and GANs
- URL: http://arxiv.org/abs/2403.14897v3
- Date: Fri, 26 Jul 2024 09:53:21 GMT
- Title: Geometric Generative Models based on Morphological Equivariant PDEs and GANs
- Authors: El Hadji S. Diop, Thierno Fall, Alioune Mbengue, Mohamed Daoudi
- Abstract summary: We propose a geometric generative model based on an equivariant partial differential equation (PDE) for group convolution neural networks (G-CNNs).
The proposed geometric morphological GAN (GM-GAN) is obtained by using the proposed morphological equivariant convolutions in PDE-G-CNNs.
Preliminary results show that the GM-GAN model outperforms the classical GAN.
- Score: 3.6498648388765513
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Content and image generation consist of creating or generating data from noisy information by extracting specific features such as texture, edges, and other thin image structures. We focus here on generative models and address two main problems: first, improving the extraction of specific features while accounting for intrinsic geometric features at multiple scales; and second, making the network equivariant in order to reduce its complexity and provide geometric interpretability. To this end, we propose a geometric generative model based on an equivariant partial differential equation (PDE) for group convolution neural networks (G-CNNs), so-called PDE-G-CNNs, built on morphological operators and generative adversarial networks (GANs). Equivariant morphological PDE layers are composed of multiscale dilations and erosions formulated on Riemannian manifolds, while group symmetries are defined on a Lie group. We take advantage of the Lie group structure to properly integrate equivariance into the layers, and use the Riemannian metric to solve the multiscale morphological operations. Each point of the Lie group is associated with a unique point in the manifold, which helps us derive a metric on the Riemannian manifold from a tensor field that is invariant under the Lie group, so that the induced metric has the same symmetries. The proposed geometric morphological GAN (GM-GAN) is obtained by using the proposed morphological equivariant convolutions in PDE-G-CNNs to bring nonlinearity into classical CNNs. GM-GAN is evaluated on MNIST data and compared with GANs. Preliminary results show that GM-GAN outperforms the classical GAN.
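As a concrete illustration of the building block described above, the following is a minimal Euclidean sketch of a multiscale morphological layer: max-plus convolution with a quadratic structuring function, whose output solves the Hamilton-Jacobi PDE u_t = ||grad u||^2. The paper's actual layers use kernels derived from a left-invariant Riemannian metric on a Lie group; the names quadratic_kernel, dilation2d, and erosion2d below are illustrative, not from the paper.

```python
import numpy as np

def quadratic_kernel(radius: int, t: float) -> np.ndarray:
    """Structuring function k_t(y) = -|y|^2 / (4t) on a (2r+1)^2 grid.

    Max-plus convolution with this kernel is the viscosity solution of the
    Hamilton-Jacobi PDE u_t = ||grad u||^2 (a standard fact); the paper's
    Riemannian/Lie-group kernels are not implemented in this Euclidean toy.
    """
    ys = np.arange(-radius, radius + 1, dtype=float)
    yy, xx = np.meshgrid(ys, ys, indexing="ij")
    return -(xx**2 + yy**2) / (4.0 * t)

def dilation2d(f: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Max-plus ('morphological') convolution: (f (+) k)(x) = max_y f(x - y) + k(y).

    Assumes a symmetric kernel k(y) = k(-y), which the quadratic kernel is.
    """
    r = k.shape[0] // 2
    fp = np.pad(f.astype(float), r, mode="edge")
    out = np.full(f.shape, -np.inf)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            window = fp[r + dy : r + dy + f.shape[0], r + dx : r + dx + f.shape[1]]
            out = np.maximum(out, window + k[r + dy, r + dx])
    return out

def erosion2d(f: np.ndarray, k: np.ndarray) -> np.ndarray:
    """Min-plus counterpart via duality: (f (-) k) = -((-f) (+) k) for symmetric k."""
    return -dilation2d(-f, k)
```

Small values of t keep the layer close to the identity, while larger t widens the effective structuring element; sweeping t is what makes the dilations and erosions multiscale.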
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
First, we introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes.
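A rough sketch of the relative-representation idea, with a per-dimension standardization standing in for the paper's normalization procedure (the standardization is an assumption for illustration, not their exact construction):

```python
import numpy as np

def relative_representation(emb: np.ndarray, anchors: np.ndarray, eps: float = 1e-8) -> np.ndarray:
    """Re-express each sample by its cosine similarities to a set of anchors.

    The per-dimension standardization below removes non-isotropic per-dimension
    rescalings, and cosine similarity is unaffected by a permutation applied
    jointly to all embedding dimensions.
    """
    mu, sigma = emb.mean(axis=0), emb.std(axis=0) + eps
    z = (emb - mu) / sigma                 # samples, standardized per dimension
    za = (anchors - mu) / sigma            # anchors, same statistics
    z /= np.linalg.norm(z, axis=1, keepdims=True) + eps
    za /= np.linalg.norm(za, axis=1, keepdims=True) + eps
    return z @ za.T                        # (n_samples, n_anchors)
```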
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
- Relaxing Continuous Constraints of Equivariant Graph Neural Networks for Physical Dynamics Learning [39.25135680793105]
We propose a general Discrete Equivariant Graph Neural Network (DEGNN) that guarantees equivariance to a given discrete point group.
Specifically, we show that such discrete equivariant message passing can be constructed by transforming geometric features into permutation-invariant embeddings.
We show that DEGNN is data efficient, learning with less data, and generalizes across scenarios such as unobserved orientations.
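A generic way to guarantee equivariance to a discrete point group, sketched below, is group averaging; this is only a caricature of the idea, not DEGNN's actual message-passing construction:

```python
import numpy as np

def c4_rotations() -> list[np.ndarray]:
    """The discrete point group C4: planar rotations by multiples of 90 degrees."""
    return [np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]])
            for a in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]

def group_average(phi, x: np.ndarray) -> np.ndarray:
    """psi(x) = (1/|G|) sum_g g^{-1} phi(g x), equivariant by construction:
    psi(h x) = h psi(x) for every h in G. (g.T is g^{-1} for rotations.)"""
    gs = c4_rotations()
    return sum(g.T @ phi(g @ x) for g in gs) / len(gs)

# Any map phi becomes C4-equivariant after averaging:
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 2))
phi = lambda v: np.tanh(W @ v)
x, h = np.array([1.0, 2.0]), c4_rotations()[1]
assert np.allclose(group_average(phi, h @ x), h @ group_average(phi, x))
```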
arXiv Detail & Related papers (2024-06-24T03:37:51Z)
- E(n) Equivariant Topological Neural Networks [10.603892843083173]
Graph neural networks excel at modeling pairwise interactions, but they cannot flexibly accommodate higher-order interactions and features.
Topological deep learning (TDL) has emerged recently as a promising tool for addressing this issue.
This paper introduces E(n)-Equivariant Topological Neural Networks (ETNNs).
ETNNs incorporate geometric node features while respecting rotation, reflection, and translation.
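The standard trick behind this kind of E(n) invariance is to build messages from pairwise squared distances, which rotations, reflections, and translations leave unchanged. A minimal sketch follows, where mlp is a hypothetical per-pair network, not an ETNN component:

```python
import numpy as np

def invariant_messages(pos: np.ndarray, h: np.ndarray, mlp) -> np.ndarray:
    """pos: (N, 3) coordinates; h: (N, F) node features; mlp: per-pair map.

    Squared distances ||x_i - x_j||^2 are unchanged by rotations, reflections,
    and translations, so any message built from (h_i, h_j, d_ij^2) is
    E(n)-invariant; summing over neighbours preserves this.
    """
    n, f = h.shape
    d2 = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(axis=-1)[..., None]  # (N, N, 1)
    hi = np.broadcast_to(h[:, None, :], (n, n, f))
    hj = np.broadcast_to(h[None, :, :], (n, n, f))
    pairs = np.concatenate([hi, hj, d2], axis=-1)  # (N, N, 2F + 1)
    return mlp(pairs).sum(axis=1)                  # aggregate messages per node
```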
arXiv Detail & Related papers (2024-05-24T10:55:38Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- ChebLieNet: Invariant Spectral Graph NNs Turned Equivariant by Riemannian Geometry on Lie Groups [9.195729979000404]
ChebLieNet is a group-equivariant method on (anisotropic) manifolds.
We develop a graph neural network made of anisotropic convolutional layers.
We empirically demonstrate the existence of (data-dependent) sweet spots for anisotropic parameters on CIFAR10.
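The isotropic spectral building block being generalized here is the Chebyshev graph filter; a plain version is sketched below, with ChebLieNet's Lie-group graph and anisotropic Laplacian left out:

```python
import numpy as np

def cheb_filter(L: np.ndarray, x: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """Spectral filter sum_k theta_k T_k(L) x via the Chebyshev recurrence
    T_k = 2 L T_{k-1} - T_{k-2}. L must be rescaled so its spectrum lies in
    [-1, 1], e.g. L = 2 L_norm / lambda_max - I."""
    t_prev, t_curr = x, L @ x
    out = theta[0] * t_prev
    if len(theta) > 1:
        out = out + theta[1] * t_curr
    for k in range(2, len(theta)):
        t_prev, t_curr = t_curr, 2.0 * (L @ t_curr) - t_prev
        out = out + theta[k] * t_curr
    return out
```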
arXiv Detail & Related papers (2021-11-23T20:19:36Z)
- Equivariant Neural Network for Factor Graphs [83.26543234955855]
We propose two inference models: Factor-Equivariant Neural Belief Propagation (FE-NBP) and Factor-Equivariant Graph Neural Networks (FE-GNN).
FE-NBP achieves state-of-the-art performance on small datasets while FE-GNN achieves state-of-the-art performance on large datasets.
arXiv Detail & Related papers (2021-09-29T06:54:04Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
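One way to make self-attention commute with a group action is to let the attention logits depend only on group-invariant pairwise quantities; the sketch below shows this for the translation group only and is not the LieSelfAttention layer itself:

```python
import numpy as np

def relative_attention(x: np.ndarray, pos: np.ndarray, w: np.ndarray) -> np.ndarray:
    """Self-attention whose logits depend only on relative positions
    pos_i - pos_j, so the layer commutes with a global translation of all
    positions. x: (N, F) features; pos: (N, D) positions; w: (D,) weights."""
    rel = pos[:, None, :] - pos[None, :, :]          # invariant under pos -> pos + t
    logits = rel @ w                                 # (N, N)
    a = np.exp(logits - logits.max(axis=-1, keepdims=True))
    a /= a.sum(axis=-1, keepdims=True)               # softmax over j
    return a @ x                                     # attention-weighted values
```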
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to define convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
- Generalizing Convolutional Neural Networks for Equivariance to Lie Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
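The discrete backbone of such layers is the group convolution (f * k)(g) = sum_h k(g^{-1} h) f(h); the paper extends it to continuous data and arbitrary Lie groups via the log map and Monte Carlo sampling. A sketch for SO(2) sampled at N angles, where g^{-1} h reduces to an index difference:

```python
import numpy as np

def group_conv_so2(f: np.ndarray, k: np.ndarray) -> np.ndarray:
    """(f * k)(g) = sum_h k(g^{-1} h) f(h) for SO(2) sampled at N angles.

    For rotations, g^{-1} h is just the angle difference, so the group
    convolution reduces to a cyclic correlation; rotating the input by one
    sample rotates the output by one sample (equivariance).
    """
    n = len(f)
    return np.array([sum(k[(h - g) % n] * f[h] for h in range(n)) for g in range(n)])
```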
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
- PDE-based Group Equivariant Convolutional Neural Networks [1.949912057689623]
We present a PDE-based framework that generalizes Group equivariant Convolutional Neural Networks (G-CNNs).
In this framework, a network layer is seen as a set of PDE-solvers where geometrically meaningful PDE-coefficients become the layer's trainable weights.
We present experiments to demonstrate the strength of the proposed PDE-G-CNNs in increasing the performance of deep learning-based imaging applications.
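As a caricature of "layer = PDE solver with trainable coefficients", the toy layer below runs a few explicit Euler steps of an anisotropic diffusion, with the diffusivities (a, b) standing in for the learned PDE coefficients; the actual PDE-G-CNN units solve convection, dilation, and erosion PDEs on homogeneous spaces:

```python
import numpy as np

def pde_layer(u: np.ndarray, a: float, b: float, steps: int = 5, dt: float = 0.1) -> np.ndarray:
    """A few explicit Euler steps of the anisotropic diffusion
    u_t = a u_xx + b u_yy on a periodic unit grid; (a, b) play the role of
    the trainable PDE coefficients. Stable for dt <= 1 / (2 (a + b))."""
    for _ in range(steps):
        uxx = np.roll(u, -1, axis=1) - 2.0 * u + np.roll(u, 1, axis=1)
        uyy = np.roll(u, -1, axis=0) - 2.0 * u + np.roll(u, 1, axis=0)
        u = u + dt * (a * uxx + b * uyy)
    return u
```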
arXiv Detail & Related papers (2020-01-24T15:00:46Z)