Affine Invariance in Continuous-Domain Convolutional Neural Networks
- URL: http://arxiv.org/abs/2311.09245v1
- Date: Mon, 13 Nov 2023 14:17:57 GMT
- Title: Affine Invariance in Continuous-Domain Convolutional Neural Networks
- Authors: Ali Mohaddes, Johannes Lederer
- Abstract summary: This research studies affine invariance on continuous-domain convolutional neural networks.
We introduce a new criterion to assess the similarity of two input signals under affine transformations.
Our research could eventually extend the scope of geometrical transformations that practical deep-learning pipelines can handle.
- Score: 6.019182604573028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The notion of group invariance helps neural networks recognize patterns
and features under geometric transformations. Indeed, it has been shown that
group invariance can substantially improve deep-learning performance in practice,
where such transformations are very common. This research studies affine
invariance in continuous-domain convolutional neural networks. Whereas other
research considers isometric or similarity invariance, we focus on
the full structure of affine transforms generated by the general linear
group $\mathrm{GL}_2(\mathbb{R})$. We introduce a new criterion to assess the
similarity of two input signals under affine transformations. Then, unlike
conventional methods that involve solving complex optimization problems on the
Lie group $G_2$, we analyze the convolution of lifted signals and compute the
corresponding integration over $G_2$. In sum, our research could eventually
extend the scope of geometrical transformations that practical deep-learning
pipelines can handle.
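The core recipe in the abstract (lift an input signal from the plane to a function on the affine group, then convolve and integrate over that group) can be illustrated with a small numerical toy. The sketch below is not the authors' construction: the discretized grid of rotations and scalings standing in for $\mathrm{GL}_2(\mathbb{R})$, the nearest-neighbour warp, and the helper names `affine_warp`, `lift`, and `group_integral` are all illustrative assumptions.

```python
# Hedged sketch, not the paper's code: a toy "lift, then integrate over the
# group" pipeline on a small discretized stand-in for the affine group G_2.
import numpy as np

def affine_warp(img, A, t):
    """Sample img at affine-transformed coordinates x -> A x + t (nearest neighbour)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    coords = np.stack([xs - w / 2, ys - h / 2], axis=-1)   # centred (x, y) grid
    src = coords @ A.T + t                                  # apply the affine map
    sx = np.clip(np.round(src[..., 0] + w / 2).astype(int), 0, w - 1)
    sy = np.clip(np.round(src[..., 1] + h / 2).astype(int), 0, h - 1)
    return img[sy, sx]

def lift(img, group_elems):
    """Lift a 2-D signal to a function on the (discretized) group:
    one warped copy of the signal per sampled group element."""
    return np.stack([affine_warp(img, A, t) for A, t in group_elems])

def group_integral(lifted):
    """Crude stand-in for integration over G_2: average over the sampled
    group elements (ignoring the proper Haar-measure weighting)."""
    return lifted.mean(axis=0)

# A tiny grid of rotations and isotropic scalings as an illustrative,
# very coarse discretization of a subset of GL_2(R).
angles = np.linspace(0, 2 * np.pi, 8, endpoint=False)
scales = [0.8, 1.0, 1.25]
group_elems = [
    (s * np.array([[np.cos(a), -np.sin(a)], [np.sin(a), np.cos(a)]]), np.zeros(2))
    for a in angles for s in scales
]

rng = np.random.default_rng(0)
img = rng.random((32, 32))
# Integrating the lifted signal over the group damps the effect of applying
# one of the sampled transformations to the input, which is the intuition
# behind using such integrals as an (approximate) affine-invariance criterion.
feat_orig = group_integral(lift(img, group_elems))
feat_warp = group_integral(lift(affine_warp(img, group_elems[3][0], np.zeros(2)), group_elems))
print(np.abs(feat_orig - feat_warp).mean())
```

The invariance here is only approximate: the coarse grid of group elements is not closed under composition and the boundary handling in the warp is crude, whereas the paper works with exact integrals over the continuous group.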
Related papers
- Recursive Self-Similarity in Deep Weight Spaces of Neural Architectures: A Fractal and Coarse Geometry Perspective [2.9130383514140292]
This paper conceptualizes the Deep Weight Spaces as hierarchical, fractal-like, coarse geometric structures observable at discrete integer scales.
We introduce a coarse group action termed the fractal transformation, $T_{r_k}$, acting under the symmetry group $G = (\mathbb{Z}, +)$.
This perspective adopts a box count technique, commonly used to assess the hierarchical and scale-related geometry of physical structures.
arXiv Detail & Related papers (2025-03-18T14:41:23Z) - Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
We also propose deploying topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes.
arXiv Detail & Related papers (2024-09-17T08:09:22Z) - Lie Group Decompositions for Equivariant Neural Networks [12.139222986297261]
We show how convolution kernels can be parametrized to build models equivariant with respect to affine transformations.
We evaluate the robustness and out-of-distribution generalisation capability of our model on the benchmark affine-invariant classification task.
arXiv Detail & Related papers (2023-10-17T16:04:33Z) - Deep Neural Networks with Efficient Guaranteed Invariances [77.99182201815763]
We address the problem of improving the performance and in particular the sample complexity of deep neural networks.
Group-equivariant convolutions are a popular approach to obtain equivariant representations.
We propose a multi-stream architecture, where each stream is invariant to a different transformation.
arXiv Detail & Related papers (2023-03-02T20:44:45Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras
from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Implicit Convolutional Kernels for Steerable CNNs [5.141137421503899]
Steerable convolutional neural networks (CNNs) provide a general framework for building neural networks equivariant to translations and transformations of an origin-preserving group $G$.
We propose using implicit neural representation via multi-layer perceptrons (MLPs) to parameterize $G$-steerable kernels.
We prove the effectiveness of our method on multiple tasks, including N-body simulations, point cloud classification and molecular property prediction.
arXiv Detail & Related papers (2022-12-12T18:10:33Z) - Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and
Group Convolution [90.67482899242093]
A wide range of techniques have been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of both methods and their equivalence and relate the two constructions to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z) - Learning Invariant Representations for Equivariant Neural Networks Using
Orthogonal Moments [9.680414207552722]
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation.
Recently, a new class of CNNs has been proposed in which the conventional layers of CNNs are replaced with equivariant convolution, pooling, and batch-normalization layers.
arXiv Detail & Related papers (2022-09-22T11:48:39Z) - Universality of group convolutional neural networks based on ridgelet
analysis on groups [10.05944106581306]
We investigate the approximation property of group convolutional neural networks (GCNNs) based on the ridgelet theory.
We formulate a versatile GCNN as a nonlinear mapping between group representations.
arXiv Detail & Related papers (2022-05-30T02:52:22Z) - On the Effective Number of Linear Regions in Shallow Univariate ReLU
Networks: Convergence Guarantees and Implicit Bias [50.84569563188485]
We show that gradient flow converges in direction when labels are determined by the sign of a target network with $r$ neurons.
Our result may already hold for mild over-parameterization, where the width is $\tilde{\mathcal{O}}(r)$ and independent of the sample size.
arXiv Detail & Related papers (2022-05-18T16:57:10Z) - Revisiting Transformation Invariant Geometric Deep Learning: Are Initial
Representations All You Need? [80.86819657126041]
We show that transformation-invariant and distance-preserving initial representations are sufficient to achieve transformation invariance.
Specifically, we realize transformation-invariant and distance-preserving initial point representations by modifying multi-dimensional scaling.
We prove that TinvNN can strictly guarantee transformation invariance, being general and flexible enough to be combined with the existing neural networks.
arXiv Detail & Related papers (2021-12-23T03:52:33Z) - Topographic VAEs learn Equivariant Capsules [84.33745072274942]
We introduce the Topographic VAE: a novel method for efficiently training deep generative models with topographically organized latent variables.
We show that such a model indeed learns to organize its activations according to salient characteristics such as digit class, width, and style on MNIST.
We demonstrate approximate equivariance to complex transformations, expanding upon the capabilities of existing group equivariant neural networks.
arXiv Detail & Related papers (2021-09-03T09:25:57Z) - Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on an arbitrary manifold $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
arXiv Detail & Related papers (2021-05-28T15:41:52Z) - Equivariant neural networks for inverse problems [1.7942265700058986]
We show that group equivariant convolutional operations can naturally be incorporated into learned reconstruction methods.
We design learned iterative methods in which the proximal operators are modelled as group equivariant convolutional neural networks.
arXiv Detail & Related papers (2021-02-23T05:38:41Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z) - Generalizing Convolutional Neural Networks for Equivariance to Lie
Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)