Unified Fourier-based Kernel and Nonlinearity Design for Equivariant
Networks on Homogeneous Spaces
- URL: http://arxiv.org/abs/2206.08362v1
- Date: Thu, 16 Jun 2022 17:59:01 GMT
- Title: Unified Fourier-based Kernel and Nonlinearity Design for Equivariant
Networks on Homogeneous Spaces
- Authors: Yinshuang Xu and Jiahui Lei and Edgar Dobriban and Kostas Daniilidis
- Abstract summary: We introduce a unified framework for group equivariant networks on homogeneous spaces.
We take advantage of the sparsity of Fourier coefficients of the lifted feature fields.
We show that other methods treating features as the Fourier coefficients in the stabilizer subgroup are special cases of our activation.
- Score: 52.424621227687894
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce a unified framework for group equivariant networks on
homogeneous spaces derived from a Fourier perspective. We address the case of
feature fields that are tensor-valued before and after a convolutional layer. We
present a unified derivation of kernels via the Fourier domain by taking
advantage of the sparsity of Fourier coefficients of the lifted feature fields.
The sparsity emerges when the stabilizer subgroup of the homogeneous space is a
compact Lie group. We further introduce an activation method via an elementwise
nonlinearity on the regular representation after lifting and projecting back to
the field through an equivariant convolution. We show that other methods
treating features as the Fourier coefficients in the stabilizer subgroup are
special cases of our activation. Experiments on $SO(3)$ and $SE(3)$ show
state-of-the-art performance in spherical vector field regression, point cloud
classification, and molecular completion.
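The activation described in the abstract (lift to the regular representation, apply an elementwise nonlinearity, project back to a band-limited field) can be sketched on the simplest setting, the discretized circle group C_N standing in for SO(2). This is a minimal illustration, not the authors' SO(3)/SE(3) implementation; the function names below are our own, and on C_N the lift and projection reduce to the FFT:

```python
import numpy as np

def regular_nonlinearity(coeffs, band):
    """Fourier-based activation sketch on the cyclic group C_N.
    coeffs: length-N complex DFT coefficients of a band-limited feature."""
    f = np.fft.ifft(coeffs)              # lift: feature values on the N group elements
    f = np.maximum(f.real, 0.0)          # elementwise nonlinearity (ReLU) on the regular rep
    out = np.fft.fft(f)                  # project back to Fourier coefficients
    k = np.fft.fftfreq(len(coeffs), d=1.0 / len(coeffs))  # integer frequencies
    out[np.abs(k) > band] = 0.0          # re-impose the original band limit
    return out

def rotate(coeffs, m):
    """Rotation by 2*pi*m/N acts on frequency k as a phase shift."""
    N = len(coeffs)
    k = np.fft.fftfreq(N, d=1.0 / N)
    return coeffs * np.exp(-2j * np.pi * k * m / N)

# Equivariance check: rotating the input then applying the activation
# matches applying the activation then rotating the output.
N, band = 64, 5
rng = np.random.default_rng(0)
k = np.fft.fftfreq(N, d=1.0 / N)
c = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) * (np.abs(k) <= band)
lhs = regular_nonlinearity(rotate(c, 3), band)
rhs = rotate(regular_nonlinearity(c, 3 * 0 + 3), 3) if False else rotate(regular_nonlinearity(c, band), 3)
assert np.allclose(lhs, rhs)
```

Because the ReLU acts pointwise on group elements and the band-limit mask acts per frequency, both commute with the rotation action, so equivariance is exact on C_N (up to floating-point error).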
Related papers
- Tensor cumulants for statistical inference on invariant distributions [49.80012009682584]
We show that PCA becomes computationally hard at a critical value of the signal's magnitude.
We define a new set of objects, which provide an explicit, near-orthogonal basis for invariants of a given degree.
It also lets us analyze a new problem of distinguishing between different ensembles.
arXiv Detail & Related papers (2024-04-29T14:33:24Z)
- On the Fourier analysis in the SO(3) space: EquiLoPO Network [2.7624021966289605]
Existing deep-learning approaches utilize either group convolutional networks limited to discrete rotations or steerable convolutional networks with constrained filter structures.
This work proposes a novel equivariant neural network architecture that achieves analytical Equivariance to Local Pattern Orientation on the continuous SO(3) group.
By integrating these operations into a ResNet-style architecture, we propose a model that overcomes the limitations of prior methods.
arXiv Detail & Related papers (2024-04-24T16:54:39Z)
- Enabling Efficient Equivariant Operations in the Fourier Basis via Gaunt Tensor Products [14.984349569810275]
We propose a systematic approach to reduce the complexity of tensor products of irreps.
We introduce the Gaunt Tensor Product, which serves as a new method to construct efficient equivariant operations.
Our experiments on the Open Catalyst Project and 3BPA datasets demonstrate both the increased efficiency and improved performance.
arXiv Detail & Related papers (2024-01-18T18:57:10Z)
- Harmonics of Learning: Universal Fourier Features Emerge in Invariant Networks [14.259918357897408]
We prove that, under certain conditions, if a neural network is invariant to a finite group then its weights recover the Fourier transform on that group.
This provides a mathematical explanation for the emergence of Fourier features -- a ubiquitous phenomenon in both biological and artificial learning systems.
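The claim that invariant weights recover the group Fourier transform can be checked concretely for the simplest finite group, the cyclic group Z_N: linear maps equivariant to circular shifts are exactly the circulant matrices, and the DFT matrix (the Fourier transform on Z_N) diagonalizes all of them. A toy illustration, not the paper's general proof:

```python
import numpy as np

# Shift-equivariant linear layers on Z_N are circulant matrices.
N = 8
rng = np.random.default_rng(0)
w = rng.standard_normal(N)
C = np.stack([np.roll(w, i) for i in range(N)])  # circulant: C[i, j] = w[(j - i) % N]

# The DFT matrix is the Fourier transform on Z_N; its inverse's columns
# are the Fourier modes, which are eigenvectors of every circulant.
F = np.fft.fft(np.eye(N))
D = F @ C @ np.linalg.inv(F)

# D is diagonal: the Fourier basis simultaneously diagonalizes all
# shift-equivariant weights.
assert np.allclose(D, np.diag(np.diag(D)), atol=1e-10)
```

In other words, on Z_N the Fourier modes are the canonical weight basis for shift-invariant learning, which is the finite-group shadow of the result stated above.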
arXiv Detail & Related papers (2023-12-13T22:42:55Z)
- A Geometric Insight into Equivariant Message Passing Neural Networks on Riemannian Manifolds [1.0878040851638]
We argue that the metric attached to a coordinate-independent feature field should optimally preserve the principal bundle's original metric.
We obtain a message passing scheme on the manifold by discretizing the diffusion equation flow for a fixed time step.
The discretization of the higher-order diffusion process on a graph yields a new general class of equivariant GNN.
arXiv Detail & Related papers (2023-10-16T14:31:13Z)
- Third quantization of open quantum systems: new dissipative symmetries and connections to phase-space and Keldysh field theory formulations [77.34726150561087]
We reformulate the technique of third quantization in a way that explicitly connects all three methods.
We first show that our formulation reveals a fundamental dissipative symmetry present in all quadratic bosonic or fermionic Lindbladians.
For bosons, we then show that the Wigner function and the characteristic function can be thought of as ''wavefunctions'' of the density matrix.
arXiv Detail & Related papers (2023-02-27T18:56:40Z)
- Deep Fourier Up-Sampling [100.59885545206744]
Up-sampling in the Fourier domain is more challenging because, unlike its spatial counterpart, it does not obey a local interpolation property.
We propose a theoretically sound Deep Fourier Up-Sampling (FourierUp) to solve these issues.
arXiv Detail & Related papers (2022-10-11T06:17:31Z)
- VolterraNet: A higher order convolutional network with group equivariance for homogeneous manifolds [19.39397826006002]
Convolutional neural networks have been highly successful in image-based learning tasks.
Recent work has generalized the traditional convolutional layer of a convolutional neural network to non-Euclidean spaces.
We present a novel higher order Volterra convolutional neural network (VolterraNet) for data defined as samples of functions.
arXiv Detail & Related papers (2021-06-05T19:28:16Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- Learning Set Functions that are Sparse in Non-Orthogonal Fourier Bases [73.53227696624306]
We present a new family of algorithms for learning Fourier-sparse set functions.
In contrast to other work that focused on the Walsh-Hadamard transform, our novel algorithms operate with recently introduced non-orthogonal Fourier transforms.
We demonstrate effectiveness on several real-world applications.
arXiv Detail & Related papers (2020-10-01T14:31:59Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.