A Geometric Insight into Equivariant Message Passing Neural Networks on
Riemannian Manifolds
- URL: http://arxiv.org/abs/2310.10448v1
- Date: Mon, 16 Oct 2023 14:31:13 GMT
- Title: A Geometric Insight into Equivariant Message Passing Neural Networks on
Riemannian Manifolds
- Authors: Ilyes Batatia
- Abstract summary: We argue that the metric attached to a coordinate-independent feature field should optimally preserve the principal bundle's original metric.
We obtain a message passing scheme on the manifold by discretizing the diffusion equation flow for a fixed time step.
The discretization of the higher-order diffusion process on a graph yields a new general class of equivariant GNN.
- Score: 1.0878040851638
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work proposes a geometric insight into equivariant message passing on
Riemannian manifolds. As previously proposed, numerical features on Riemannian
manifolds are represented as coordinate-independent feature fields on the
manifold. Every coordinate-independent feature field on a manifold comes with
an attached equivariant embedding of the principal bundle into the space of
numerical features. We argue that the metric this embedding induces on the
numerical feature space should optimally preserve the principal bundle's
original metric. This optimality criterion leads to the minimization of a
twisted form of the Polyakov action with respect to the graph of this
embedding, yielding an equivariant diffusion process on the associated vector
bundle. We obtain a message passing scheme on the manifold by discretizing the
diffusion equation flow for a fixed time step. We propose a higher-order
equivariant diffusion process, equivalent to diffusion on the Cartesian product
of the base manifold. The discretization of the higher-order diffusion process
on a graph yields a new general class of equivariant GNN, generalizing the ACE
and MACE formalism to data on Riemannian manifolds.
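As a rough illustration of the scheme the abstract describes (not the paper's actual implementation), a single explicit-Euler discretization of the graph heat equation already has the form of a message passing update: each node aggregates its neighbors' features and blends them with its own state. The function and graph below are hypothetical, constructed only to show this correspondence.

```python
import numpy as np

def diffusion_step(x, adj, tau=0.1):
    """One explicit-Euler step of the graph heat equation,
    x <- x - tau * L x, with L = D - A the combinatorial Laplacian.
    Read as message passing: `msg` is the sum of neighbor features,
    which each node combines with its own state."""
    deg = adj.sum(axis=1)                     # node degrees (diagonal of D)
    msg = adj @ x                             # aggregate neighbor features
    return x - tau * (deg[:, None] * x - msg)

# toy graph: a 4-cycle with a scalar feature per node
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
x = np.array([[1.0], [0.0], [0.0], [0.0]])
for _ in range(50):
    x = diffusion_step(x, A, tau=0.1)
# diffusion preserves total mass and smooths toward the constant field
```

Iterating the step for a fixed time step `tau` yields the familiar smoothing behavior of diffusion-based GNN layers; the paper's contribution is the equivariant, vector-bundle-valued generalization of this flow.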
Related papers
- Equivariant Manifold Neural ODEs and Differential Invariants [1.6073704837297416]
We develop a manifestly geometric framework for equivariant manifold neural ordinary differential equations (NODEs).
We use it to analyse their modelling capabilities for symmetric data.
arXiv Detail & Related papers (2024-01-25T12:23:22Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high-dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Generative Modeling on Manifolds Through Mixture of Riemannian Diffusion Processes [57.396578974401734]
We introduce a principled framework for building a generative diffusion process on general manifolds.
Instead of following the denoising approach of previous diffusion models, we construct a diffusion process using a mixture of bridge processes.
We develop a geometric understanding of the mixture process, deriving the drift as a weighted mean of tangent directions to the data points.
arXiv Detail & Related papers (2023-10-11T06:04:40Z)
- Decentralized Riemannian Conjugate Gradient Method on the Stiefel Manifold [59.73080197971106]
This paper presents a first-order conjugate gradient method that converges faster than steepest descent.
It aims to achieve global convergence over the Stiefel manifold.
arXiv Detail & Related papers (2023-08-21T08:02:16Z)
- The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for applying the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z)
- Unified Fourier-based Kernel and Nonlinearity Design for Equivariant Networks on Homogeneous Spaces [52.424621227687894]
We introduce a unified framework for group-equivariant networks on homogeneous spaces.
We take advantage of the sparsity of the Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
arXiv Detail & Related papers (2022-06-16T17:59:01Z)
- Differential Privacy Over Riemannian Manifolds [9.453554184019108]
We present an extension of the Laplace or K-norm mechanism that utilizes intrinsic distances and volumes on the manifold.
We demonstrate that our mechanism is rate-optimal and depends only on the dimension of the manifold, not on the dimension of any ambient space.
arXiv Detail & Related papers (2021-11-03T20:43:54Z)
- Coordinate Independent Convolutional Networks -- Isometry and Gauge Equivariant Convolutions on Riemannian Manifolds [70.32518963244466]
A major complication compared to flat spaces is that it is unclear in which alignment a convolution kernel should be applied on a manifold.
We argue that the particular choice of coordinatization should not affect a network's inference -- it should be coordinate independent.
A simultaneous demand for coordinate independence and weight sharing is shown to result in a requirement on the network to be equivariant.
arXiv Detail & Related papers (2021-06-10T19:54:19Z)
- A Unifying and Canonical Description of Measure-Preserving Diffusions [60.59592461429012]
A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
arXiv Detail & Related papers (2021-05-06T17:36:55Z)
- Nested Grassmannians for Dimensionality Reduction with Applications [7.106986689736826]
We propose a novel framework for constructing a nested sequence of homogeneous Riemannian manifolds.
We focus on applying the proposed framework to the Grassmann manifold, giving rise to the nested Grassmannians (NG).
Specifically, each planar (2D) shape can be represented as a point in complex projective space, which is a complex Grassmann manifold.
With the proposed NG structure, we develop algorithms for the supervised and unsupervised dimensionality reduction problems respectively.
arXiv Detail & Related papers (2020-10-27T20:09:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.