SO(3)-Equivariant Neural Networks for Learning Vector Fields on Spheres
- URL: http://arxiv.org/abs/2503.09456v1
- Date: Wed, 12 Mar 2025 15:00:32 GMT
- Title: SO(3)-Equivariant Neural Networks for Learning Vector Fields on Spheres
- Authors: Francesco Ballerin, Nello Blaser, Erlend Grong
- Abstract summary: Models should respect both the rotational symmetries of the sphere and the inherent symmetries of the vector fields. We introduce a deep learning architecture that respects both symmetry types using novel techniques based on group convolutions in the 3-dimensional rotation group.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Analyzing vector fields on the sphere, such as wind speed and direction on Earth, is a difficult task. Models should respect both the rotational symmetries of the sphere and the inherent symmetries of the vector fields. In this paper, we introduce a deep learning architecture that respects both symmetry types using novel techniques based on group convolutions in the 3-dimensional rotation group. This architecture is suitable for scalar and vector fields on the sphere as they can be described as equivariant signals on the 3-dimensional rotation group. Experiments show that our architecture achieves lower prediction and reconstruction error when tested on rotated data compared to both standard CNNs and spherical CNNs.
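The central idea is that a tangent vector field on the sphere can be encoded as a signal on $SO(3)$ (each rotation carries both a base point on the sphere and a tangent frame at that point), after which learning proceeds by group convolutions on $SO(3)$. The sketch below illustrates this lifting and a brute-force discrete group convolution in NumPy/SciPy. It is only an illustrative approximation under assumed names (`so3_grid`, `lift_vector_field`, `group_conv`), not the authors' implementation, which builds these operations into a full equivariant architecture.

```python
# Minimal sketch (not the paper's code): lift a tangent vector field on S^2 to a
# signal on SO(3), then apply a discrete group convolution
#   (f * psi)(R) ~ mean_Q psi(Q^{-1} R) f(Q)
# over a finite sample of rotations. All helper names are illustrative.
import numpy as np
from scipy.spatial.transform import Rotation as Rot

def so3_grid(n, seed=0):
    """Finite sample of rotation matrices standing in for a quadrature grid on SO(3)."""
    return Rot.random(n, random_state=seed).as_matrix()          # shape (n, 3, 3)

def lift_vector_field(field_fn, grid):
    """Lift a tangent vector field on S^2 to a 2-channel signal on SO(3).

    Each rotation R is identified with the point p = R e3 on the sphere together
    with the tangent frame (R e1, R e2); the lifted signal stores the field at p
    expressed in that frame, which makes both the rotation of base points and the
    rotation of the vectors themselves explicit.
    """
    e1, e2, e3 = np.eye(3)
    lifted = np.empty((len(grid), 2))
    for i, R in enumerate(grid):
        p = R @ e3                                   # base point on the sphere
        v = field_fn(p)                              # ambient 3-vector, tangent at p
        lifted[i] = [v @ (R @ e1), v @ (R @ e2)]     # coordinates in the rotated frame
    return lifted

def group_conv(signal, kernel_fn, grid):
    """Discrete SO(3) group convolution: out(R) ~ mean over Q of kernel(Q^{-1} R) @ f(Q)."""
    n = len(grid)
    out = np.zeros((n, 2))
    for i, R in enumerate(grid):
        for j, Q in enumerate(grid):
            out[i] += kernel_fn(Q.T @ R) @ signal[j]  # Q^{-1} = Q^T for rotations
    return out / n

# Toy usage: an east-pointing field and a kernel that depends only on the rotation angle.
field = lambda p: np.cross([0.0, 0.0, 1.0], p)        # tangent to circles of latitude
kernel = lambda R: np.exp(-np.arccos(np.clip((np.trace(R) - 1) / 2, -1, 1))) * np.eye(2)
grid = so3_grid(200)
out = group_conv(lift_vector_field(field, grid), kernel, grid)
```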
Related papers
- An Intrinsic Vector Heat Network [64.55434397799728]
This paper introduces a novel neural network architecture for learning tangent vector fields that are intrinsic to manifolds embedded in 3D.
We introduce a trainable vector heat diffusion module to spatially propagate vector-valued feature data across the surface.
We also demonstrate the effectiveness of our method on the useful industrial application of quadrilateral mesh generation.
arXiv Detail & Related papers (2024-06-14T00:40:31Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models of Lattice Boltzmann collision operators.
Our work paves the way towards the practical use of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - A Hitchhiker's Guide to Geometric GNNs for 3D Atomic Systems [87.30652640973317]
Recent advances in computational modelling represent atomic systems as geometric graphs, with atoms embedded as nodes in 3D Euclidean space.
Geometric Graph Neural Networks have emerged as the preferred machine learning architecture powering applications ranging from protein structure prediction to molecular simulations and material generation.
This paper provides a comprehensive and self-contained overview of the field of Geometric GNNs for 3D atomic systems.
arXiv Detail & Related papers (2023-12-12T18:44:19Z) - Fully Steerable 3D Spherical Neurons [14.86655504533083]
We propose a steerable feed-forward learning-based approach that consists of spherical decision surfaces and operates on point clouds.
Due to the inherent geometric 3D structure of our theory, we derive a 3D steerability constraint for its atomic parts.
We show how the model parameters are fully steerable at inference time.
arXiv Detail & Related papers (2021-06-02T16:30:02Z) - A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z) - Spherical Transformer: Adapting Spherical Signal to CNNs [53.18482213611481]
The Spherical Transformer transforms spherical signals into vectors that can be directly processed by standard CNNs.
We evaluate our approach on the tasks of spherical MNIST recognition, 3D object classification and omnidirectional image semantic segmentation.
arXiv Detail & Related papers (2021-01-11T12:33:16Z) - Rotation-Invariant Autoencoders for Signals on Spheres [10.406659081400354]
We study the problem of unsupervised learning of rotation-invariant representations for spherical images.
In particular, we design an autoencoder architecture consisting of $S^2$ and $SO(3)$ convolutional layers (these two layer types are written out after this list).
Experiments on multiple datasets demonstrate the usefulness of the learned representations on clustering, retrieval and classification applications.
arXiv Detail & Related papers (2020-12-08T15:15:03Z) - Learning Equivariant Representations [10.745691354609738]
Convolutional neural networks (CNNs), which are equivariant to translations, are a successful example of exploiting symmetry in model design.
We propose equivariant models for different transformations defined by groups of symmetries.
These models leverage symmetries in the data to reduce sample and model complexity and improve generalization performance.
arXiv Detail & Related papers (2020-12-04T18:46:17Z) - Spherical Convolutional Neural Networks: Stability to Perturbations in SO(3) [175.96910854433574]
Spherical convolutional neural networks (Spherical CNNs) learn nonlinear representations from 3D data by exploiting the data structure.
This paper investigates the properties that Spherical CNNs exhibit as they pertain to the rotational structure inherent in spherical signals.
arXiv Detail & Related papers (2020-10-12T17:16:07Z)
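For orientation, the $S^2$ and $SO(3)$ convolutional layers referred to in the abstract and in the rotation-invariant autoencoder entry above are usually defined as follows. This is the standard formulation from the spherical-CNN literature, written here for reference rather than quoted from any of the listed papers. The first layer lifts a spherical signal $f\colon S^2 \to \mathbb{R}^K$ to a signal on $SO(3)$,

$$ (\psi \star f)(R) \;=\; \int_{S^2} \big\langle \psi(R^{-1}x),\, f(x)\big\rangle \, dx, \qquad R \in SO(3), $$

and subsequent layers apply the group convolution on $SO(3)$ itself,

$$ (\psi \star f)(R) \;=\; \int_{SO(3)} \big\langle \psi(Q^{-1}R),\, f(Q)\big\rangle \, dQ. $$

Both operations are equivariant: replacing $f$ by its rotation by $R_0$ (i.e. $x \mapsto f(R_0^{-1}x)$, resp. $Q \mapsto f(R_0^{-1}Q)$) rotates the output in the same way, $R \mapsto (\psi \star f)(R_0^{-1}R)$.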