Scale-Rotation-Equivariant Lie Group Convolution Neural Networks (Lie
Group-CNNs)
- URL: http://arxiv.org/abs/2306.06934v1
- Date: Mon, 12 Jun 2023 08:14:12 GMT
- Title: Scale-Rotation-Equivariant Lie Group Convolution Neural Networks (Lie
Group-CNNs)
- Authors: Wei-Dong Qiao, Yang Xu, and Hui Li
- Abstract summary: This study proposes a Lie group-CNN, which preserves scale-rotation-equivariance for image classification tasks.
The Lie group-CNN can successfully extract geometric features and perform equivariant recognition on images with rotation and scale transformations.
- Score: 5.498285766353742
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The weight-sharing mechanism of convolutional kernels ensures
translation-equivariance of convolution neural networks (CNNs). Recently,
rotation-equivariance has been investigated. However, research on
scale-equivariance or simultaneous scale-rotation-equivariance is insufficient.
This study proposes a Lie group-CNN, which preserves scale-rotation-equivariance
for image classification tasks. The Lie group-CNN includes a lifting module, a
series of group convolution modules, a global pooling layer, and a
classification layer. The lifting module transfers the input image from
Euclidean space to Lie group space, and the group convolution is parameterized
through a fully connected network that takes the Lie algebra of Lie group
elements as input to achieve scale-rotation-equivariance. The Lie group SIM(2)
is utilized to establish the Lie group-CNN with scale-rotation-equivariance.
The scale-rotation-equivariance of the Lie group-CNN is verified, and the network
achieves the best recognition accuracy on the blood cell dataset (97.50%) and the
HAM10000 dataset (77.90%), outperforming the Lie algebra convolution network,
dilated convolution, the spatial transformer network, and the scale-equivariant
steerable network. In addition, the generalization ability of the Lie group-CNN
on SIM(2) to rotation-equivariance is verified on rotated-MNIST and
rotated-CIFAR10, and the robustness of the network is verified on SO(2) and
SE(2). Therefore, the Lie group-CNN can successfully extract geometric features
and perform equivariant recognition on images with rotation and scale
transformations.
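The abstract describes the group convolution as a kernel parameterized by a fully connected network applied to Lie-algebra coordinates of group elements. The sketch below illustrates that idea for SIM(2); it is not the authors' implementation, and the element parameterization (log_s, theta, tx, ty), the simplified relative log map, and the all-pairs neighborhood are illustrative assumptions.

```python
import torch
import torch.nn as nn


def sim2_relative_log(g_i: torch.Tensor, g_j: torch.Tensor) -> torch.Tensor:
    """Lie-algebra coordinates of g_i^{-1} g_j for SIM(2) elements.

    Elements are stored as (log_s, theta, tx, ty). The translation part uses a
    simplified (first-order) log map, which is enough as input to a kernel MLP.
    """
    log_s_i, th_i, t_i = g_i[..., 0], g_i[..., 1], g_i[..., 2:]
    log_s_j, th_j, t_j = g_j[..., 0], g_j[..., 1], g_j[..., 2:]
    d_log_s = log_s_j - log_s_i
    d_th = th_j - th_i
    # Express the translation difference in the frame of g_i (undo rotation and scale).
    c, s = torch.cos(th_i), torch.sin(th_i)
    dt = t_j - t_i
    dt_local = torch.stack(
        [c * dt[..., 0] + s * dt[..., 1],
         -s * dt[..., 0] + c * dt[..., 1]], dim=-1
    ) * torch.exp(-log_s_i).unsqueeze(-1)
    return torch.cat([d_log_s.unsqueeze(-1), d_th.unsqueeze(-1), dt_local], dim=-1)


class LieGroupConv(nn.Module):
    """Group convolution whose kernel is an MLP over sim(2) coordinates (dim 4)."""

    def __init__(self, c_in: int, c_out: int, hidden: int = 32):
        super().__init__()
        self.c_in, self.c_out = c_in, c_out
        self.kernel_mlp = nn.Sequential(
            nn.Linear(4, hidden), nn.ReLU(), nn.Linear(hidden, c_in * c_out)
        )

    def forward(self, g: torch.Tensor, f: torch.Tensor) -> torch.Tensor:
        # g: (N, 4) lifted SIM(2) elements, f: (N, c_in) features living on them.
        rel = sim2_relative_log(g[:, None, :], g[None, :, :])  # (N, N, 4)
        w = self.kernel_mlp(rel).view(g.shape[0], g.shape[0], self.c_out, self.c_in)
        # f_out(g_i) = mean_j K(g_i^{-1} g_j) f(g_j): a discretized group integral.
        return torch.einsum("ijoc,jc->io", w, f) / f.shape[0]


if __name__ == "__main__":
    g = torch.randn(16, 4)                 # toy lifted elements (log_s, theta, tx, ty)
    f = torch.randn(16, 3)                 # toy features with 3 input channels
    print(LieGroupConv(3, 8)(g, f).shape)  # torch.Size([16, 8])
```

Because the kernel depends only on the relative element g_i^{-1} g_j, its output transforms consistently when all elements are multiplied by a common group element, which is what gives the layer its equivariance. A full SIM(2) network in the spirit of the abstract would first lift pixel coordinates to sampled group elements and restrict the sum to local neighborhoods; the sketch omits those steps.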
Related papers
- Symmetry Discovery for Different Data Types [52.2614860099811]
Equivariant neural networks incorporate symmetries into their architecture, achieving higher generalization performance.
We propose LieSD, a method for discovering symmetries via trained neural networks which approximate the input-output mappings of the tasks.
We validate the performance of LieSD on tasks with symmetries such as the two-body problem, the moment of inertia matrix prediction, and top quark tagging.
arXiv Detail & Related papers (2024-10-13T13:39:39Z) - Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural network-based surrogate models Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z) - Deep Neural Networks with Efficient Guaranteed Invariances [77.99182201815763]
We address the problem of improving the performance and in particular the sample complexity of deep neural networks.
Group-equivariant convolutions are a popular approach to obtain equivariant representations.
We propose a multi-stream architecture, where each stream is invariant to a different transformation.
arXiv Detail & Related papers (2023-03-02T20:44:45Z) - Learning Invariant Representations for Equivariant Neural Networks Using
Orthogonal Moments [9.680414207552722]
The convolutional layers of standard convolutional neural networks (CNNs) are equivariant to translation.
Recently, a new class of CNNs is proposed in which the conventional layers of CNNs are replaced with equivariant convolution, pooling, and batch-normalization layers.
arXiv Detail & Related papers (2022-09-22T11:48:39Z) - Equivariance versus Augmentation for Spherical Images [0.7388859384645262]
We analyze the role of rotational equivariance in convolutional neural networks (CNNs) applied to spherical images.
We compare the performance of the group equivariant networks known as S2CNNs and standard non-equivariant CNNs trained with an increasing amount of data augmentation.
arXiv Detail & Related papers (2022-02-08T16:49:30Z) - Improving the Sample-Complexity of Deep Classification Networks with
Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods to allow an application to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z) - Exploiting Redundancy: Separable Group Convolutional Networks on Lie
Groups [14.029933823101084]
Group convolutional neural networks (G-CNNs) have been shown to increase parameter efficiency and model accuracy.
In this work, we investigate the properties of representations learned by regular G-CNNs, and show considerable parameter redundancy in group convolution kernels.
We introduce convolution kernels that are separable over the subgroup and channel dimensions.
arXiv Detail & Related papers (2021-10-25T15:56:53Z) - Group Equivariant Conditional Neural Processes [30.134634059773703]
We present the group equivariant conditional neural process (EquivCNP)
We show that EquivCNP achieves comparable performance to conventional conditional neural processes in a 1D regression task.
arXiv Detail & Related papers (2021-02-17T13:50:07Z) - LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, that is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z) - Self-grouping Convolutional Neural Networks [30.732298624941738]
We propose a novel method of designing self-grouping convolutional neural networks, called SG-CNN.
For each filter, we first evaluate the importance value of their input channels to identify the importance vectors.
Using the resulting emphdata-dependent centroids, we prune the less important connections, which implicitly minimizes the accuracy loss of the pruning.
arXiv Detail & Related papers (2020-09-29T06:24:32Z) - Generalizing Convolutional Neural Networks for Equivariance to Lie
Groups on Arbitrary Continuous Data [52.78581260260455]
We propose a general method to construct a convolutional layer that is equivariant to transformations from any specified Lie group.
We apply the same model architecture to images, ball-and-stick molecular data, and Hamiltonian dynamical systems.
arXiv Detail & Related papers (2020-02-25T17:40:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.