Equivariant Manifold Neural ODEs and Differential Invariants
- URL: http://arxiv.org/abs/2401.14131v2
- Date: Thu, 10 Oct 2024 14:22:30 GMT
- Title: Equivariant Manifold Neural ODEs and Differential Invariants
- Authors: Emma Andersdotter, Daniel Persson, Fredrik Ohlsson
- Abstract summary: We develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs) and use it to analyse their modelling capabilities for symmetric data.
- Score: 1.6073704837297416
- Abstract: In this paper, we develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs) and use it to analyse their modelling capabilities for symmetric data. First, we consider the action of a Lie group $G$ on a smooth manifold $M$ and establish the equivalence between equivariance of vector fields, symmetries of the corresponding Cauchy problems, and equivariance of the associated NODEs. We also propose a novel formulation, based on Lie theory for symmetries of differential equations, of the equivariant manifold NODEs in terms of the differential invariants of the action of $G$ on $M$, which provides an efficient parameterisation of the space of equivariant vector fields in a way that is agnostic to both the manifold $M$ and the symmetry group $G$. Second, we construct augmented manifold NODEs, through embeddings into flows on the tangent bundle $TM$, and show that they are universal approximators of diffeomorphisms on any connected $M$. Furthermore, we show that universality persists in the equivariant case and that the augmented equivariant manifold NODEs can be incorporated into the geometric framework using higher-order differential invariants. Finally, we consider the induced action of $G$ on different fields on $M$ and show how it can be used to generalise previous work on, e.g., continuous normalizing flows to equivariant models in any geometry.
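To make the differential-invariant parameterisation concrete, below is a minimal sketch in the simplest non-trivial setting: $G = SO(2)$ acting on $M = \mathbb{R}^2 \setminus \{0\}$. The setting, network shapes, and names are illustrative assumptions, not code from the paper. The lowest-order differential invariant of the rotation action is the radius $r = |x|$, and every $SO(2)$-equivariant vector field can be written as $f(x) = a(r)\,x + b(r)\,x^\perp$, so learning the two scalar functions $a$ and $b$ parameterises the space of equivariant fields.

```python
# Minimal sketch: SO(2)-equivariant neural ODE on M = R^2 \ {0},
# parameterised by the differential invariant r = |x|. Network sizes
# and names are illustrative assumptions.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)

def mlp_init(widths):
    """Random weights for a small fully connected network."""
    return [(rng.normal(scale=0.3, size=(m, n)), np.zeros(n))
            for m, n in zip(widths[:-1], widths[1:])]

def mlp_apply(params, x):
    """Apply the network with tanh hidden activations."""
    for i, (W, b) in enumerate(params):
        x = x @ W + b
        if i < len(params) - 1:
            x = np.tanh(x)
    return x

# Scalar coefficient networks a(r), b(r) of the invariant r.
a_net = mlp_init([1, 16, 1])
b_net = mlp_init([1, 16, 1])

def vector_field(t, x):
    """Equivariant field f(x) = a(r) x + b(r) x_perp."""
    r = np.array([np.linalg.norm(x)])
    x_perp = np.array([-x[1], x[0]])
    return mlp_apply(a_net, r) * x + mlp_apply(b_net, r) * x_perp

# Infinitesimal equivariance: f(R x) == R f(x) for a rotation R.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
x0 = np.array([1.0, 0.5])
assert np.allclose(vector_field(0.0, R @ x0), R @ vector_field(0.0, x0))

# The induced flow then commutes with the group action as well.
flow = lambda x: solve_ivp(vector_field, (0.0, 1.0), x,
                           rtol=1e-9, atol=1e-9).y[:, -1]
assert np.allclose(flow(R @ x0), R @ flow(x0), atol=1e-6)
```

Because $r$ is invariant while $x$ and $x^\perp$ transform equivariantly, the flow of this field commutes with every rotation, illustrating the equivalence between equivariant vector fields and symmetries of the associated Cauchy problem.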
Related papers
- Equivariant Graph Network Approximations of High-Degree Polynomials for Force Field Prediction [62.05532524197309]
Equivariant deep models have shown promise in accurately predicting atomic potentials and force fields in molecular dynamics simulations.
In this work, we analyze equivariant functions for equivariant architectures and introduce a novel equivariant network, named PACE.
In experiments on commonly used benchmarks, PACE demonstrates state-of-the-art performance in predicting atomic energy and force fields.
arXiv Detail & Related papers (2024-11-06T19:34:40Z)
- Equivariant score-based generative models provably learn distributions with symmetries efficiently [7.90752151686317]
Empirical studies have demonstrated that incorporating symmetries into generative models can provide better generalization and sampling efficiency.
We provide the first theoretical analysis and guarantees of score-based generative models (SGMs) for learning distributions that are invariant with respect to some group symmetry.
arXiv Detail & Related papers (2024-10-02T05:14:28Z)
- Geometric Generative Models based on Morphological Equivariant PDEs and GANs [3.6498648388765513]
We propose a geometric generative model based on an equivariant partial differential equation (PDE) for group convolution neural networks (G-CNNs).
The proposed geometric morphological GAN (GM-GAN) is obtained by using the proposed morphological equivariant convolutions in PDE-G-CNNs.
Preliminary results show that GM-GAN outperforms classical GANs.
arXiv Detail & Related papers (2024-03-22T01:02:09Z)
- A Geometric Insight into Equivariant Message Passing Neural Networks on Riemannian Manifolds [1.0878040851638]
We argue that the metric attached to a coordinate-independent feature field should optimally preserve the principal bundle's original metric.
We obtain a message passing scheme on the manifold by discretizing the diffusion equation flow for a fixed time step.
The discretization of the higher-order diffusion process on a graph yields a new general class of equivariant GNNs.
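As a generic illustration of this construction (the graph, features, and step size below are assumptions, not details from the paper), one message-passing step can be realised as an explicit Euler step of the graph diffusion equation $\dot{h} = -Lh$:

```python
# Minimal sketch: one message-passing step as an explicit Euler
# discretisation of graph diffusion dh/dt = -L h. Graph and step
# size are illustrative assumptions.
import numpy as np

A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)    # adjacency of a toy graph
L = np.diag(A.sum(axis=1)) - A               # combinatorial Laplacian

h = np.random.default_rng(0).normal(size=(4, 3))  # node features
tau = 0.1                                    # fixed diffusion time step

# One diffusion step: each node moves toward its neighbours' features.
h_next = h - tau * (L @ h)
```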
arXiv Detail & Related papers (2023-10-16T14:31:13Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that, under these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
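A toy version of this kind of symmetry discovery, under an illustrative setup rather than the paper's algorithm, seeks a $2 \times 2$ generator $G$ whose flow preserves the labelled invariant $f(x) = |x|^2$: minimising the squared infinitesimal invariance defect $\nabla f(x) \cdot Gx$ over samples, with a unit-norm constraint to exclude $G = 0$, recovers the rotation generator up to sign.

```python
# Minimal sketch: recover the rotation generator from the invariant
# f(x) = |x|^2 by penalising the infinitesimal invariance defect
# x^T G x (proportional to grad f(x) . Gx). Illustrative setup only.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(256, 2))       # labelled sample points
G = rng.normal(size=(2, 2))         # generator to be learned

def grad_step(G, lr=0.05):
    s = np.einsum('bi,ij,bj->b', X, G, X)              # x^T G x per sample
    g = 2.0 * np.einsum('b,bi,bj->ij', s, X, X) / len(X)
    g += 4.0 * (np.sum(G * G) - 1.0) * G               # ||G||_F = 1 penalty
    return G - lr * g

for _ in range(2000):
    G = grad_step(G)

# G is now approximately proportional to [[0, -1], [1, 0]] (up to sign).
print(np.round(G / np.abs(G).max(), 2))
```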
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and Group Convolution [90.67482899242093]
A wide range of techniques have been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of both methods and their equivalence and relate the two constructions to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z)
- Equivariant Discrete Normalizing Flows [10.867162810786361]
We focus on building equivariant normalizing flows using discrete layers.
We introduce two new equivariant flows: $G$-coupling Flows and $G$-Residual Flows.
Our construction of $G$-Residual Flows is also universal, in the sense that we prove that any $G$-equivariant diffeomorphism can be exactly mapped by a $G$-Residual Flow.
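A minimal sketch of such a layer, assuming $G = SO(2)$ acting on $\mathbb{R}^2$ (an illustrative stand-in, not the paper's construction): the residual branch $f$ is built from the rotation invariant $r = |x|$, making the layer equivariant, and is scaled to be contractive, making it invertible by fixed-point iteration.

```python
# Minimal sketch: an invertible SO(2)-equivariant residual layer
# y = x + f(x) on R^2. The branch f uses only the invariant r = |x|
# and the equivariant frame x_perp, and is scaled to be contractive.
import numpy as np

def f(x):
    r = np.linalg.norm(x)
    x_perp = np.array([-x[1], x[0]])
    return 0.5 * (np.tanh(r) / (1.0 + r)) * x_perp  # Lipschitz < 1

def forward(x):
    return x + f(x)

def inverse(y, n_iter=50):
    """Invert by Banach fixed-point iteration x <- y - f(x)."""
    x = y.copy()
    for _ in range(n_iter):
        x = y - f(x)
    return x

x = np.array([0.8, -0.3])
assert np.allclose(inverse(forward(x)), x, atol=1e-10)
```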
arXiv Detail & Related papers (2021-10-16T20:16:00Z)
- Equivariant Manifold Flows [48.21296508399746]
We lay the theoretical foundations for learning symmetry-invariant distributions on arbitrary manifolds via equivariant manifold flows.
We demonstrate the utility of our approach by using it to learn gauge invariant densities over $SU(n)$ in the context of quantum field theory.
arXiv Detail & Related papers (2021-07-19T03:04:44Z)
- Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on arbitrary manifolds $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
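As a discrete toy stand-in for this formalism (a full gauge equivariant convolution is beyond a short sketch, and everything below is an illustrative assumption), here is a lifting convolution equivariant under the rotation group $C_4$: rotating the input rotates each feature map and cyclically permutes the group channels.

```python
# Minimal sketch: a C4 (90-degree rotation) equivariant lifting
# convolution on a square grid -- a discrete toy stand-in for the
# gauge equivariant construction. All details are illustrative.
import numpy as np
from scipy.signal import correlate2d

rng = np.random.default_rng(0)
img = rng.normal(size=(9, 9))
psi = rng.normal(size=(3, 3))          # base filter

def lift_conv(x, psi):
    """Correlate with all four rotated copies of the filter."""
    return np.stack([correlate2d(x, np.rot90(psi, k), mode='valid')
                     for k in range(4)])

out = lift_conv(img, psi)

# Equivariance: rotating the input rotates each feature map and
# cyclically permutes the four group channels.
out_rot = lift_conv(np.rot90(img), psi)
expected = np.stack([np.rot90(out[(k - 1) % 4]) for k in range(4)])
assert np.allclose(out_rot, expected)
```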
arXiv Detail & Related papers (2021-05-28T15:41:52Z)
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
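A standard construction consistent with this geometric picture, shown here as a hedged sketch rather than the paper's code, parameterises the recurrent matrix as the exponential of a skew-symmetric matrix, which is always orthogonal and therefore norm-preserving before the nonlinearity:

```python
# Minimal sketch: orthogonal recurrent transition via the exponential
# of a skew-symmetric matrix. A standard parameterisation; names and
# sizes are illustrative.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
n = 8
A = rng.normal(size=(n, n))
W = expm(A - A.T)                      # skew-symmetric generator -> orthogonal

assert np.allclose(W.T @ W, np.eye(n), atol=1e-12)

h = rng.normal(size=n)                 # hidden state
x_t = rng.normal(size=n)               # input at step t
h_next = np.tanh(W @ h + x_t)          # norm of W @ h equals norm of h
```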