Rotated Ring, Radial and Depth Wise Separable Radial Convolutions
- URL: http://arxiv.org/abs/2010.00873v3
- Date: Sun, 17 Jan 2021 12:08:07 GMT
- Title: Rotated Ring, Radial and Depth Wise Separable Radial Convolutions
- Authors: Wolfgang Fuhl, Enkelejda Kasneci
- Abstract summary: In this work, we address trainable rotation-invariant convolutions and the construction of nets that use them.
We show that our approach is rotationally invariant for different models and on different public data sets.
The rotationally adaptive convolution models presented are more computationally intensive than normal convolution models.
- Score: 13.481518628796692
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Simple image rotations significantly reduce the accuracy of deep neural
networks. Moreover, training with all possible rotations enlarges the data set,
which in turn increases the training duration. In this work, we address
trainable rotation-invariant convolutions as well as the construction of nets,
since fully connected layers can only be rotation invariant with a
one-dimensional input. We show that our approach is rotationally invariant for
different models and on different public data sets, and we discuss the
influence of purely rotation-invariant features on accuracy. The rotationally
adaptive convolution models presented in this work are more computationally
intensive than normal convolution models. Therefore, we also present a
depth-wise separable approach with radial convolution. Link to CUDA code:
https://atreus.informatik.uni-tuebingen.de/seafile/d/8e2ab8c3fdd444e1a135/
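The core radial idea lends itself to a compact sketch: share one learnable weight per ring of the kernel, so every kernel is rotationally symmetric by construction, and chain a depth-wise radial convolution with a 1x1 convolution for the separable variant. The PyTorch module below is a minimal illustration under these assumptions; class and parameter names are ours, and the linked CUDA code remains the reference implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RadialConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=5, depthwise=False):
        super().__init__()
        assert kernel_size % 2 == 1
        c = kernel_size // 2
        ys, xs = torch.meshgrid(torch.arange(kernel_size),
                                torch.arange(kernel_size), indexing="ij")
        # Map every kernel cell to its (rounded) distance from the center.
        radius = torch.sqrt((ys - c).float() ** 2 + (xs - c).float() ** 2)
        self.register_buffer("rings", radius.round().long())   # (k, k)
        self.n_rings = int(self.rings.max().item()) + 1
        self.groups = in_ch if depthwise else 1
        out_ch = in_ch if depthwise else out_ch
        # One weight per (out, in/groups, ring) instead of per (out, in, y, x).
        self.ring_weight = nn.Parameter(
            torch.randn(out_ch, in_ch // self.groups, self.n_rings) * 0.1)
        self.padding = c

    def forward(self, x):
        # Expand ring weights back into a full, rotationally symmetric kernel.
        kernel = self.ring_weight[:, :, self.rings]            # (O, I, k, k)
        return F.conv2d(x, kernel, padding=self.padding, groups=self.groups)

# Depth-wise separable variant: a radial depthwise conv followed by a 1x1 conv.
class SeparableRadialConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, kernel_size=5):
        super().__init__()
        self.depthwise = RadialConv2d(in_ch, in_ch, kernel_size, depthwise=True)
        self.pointwise = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x):
        return self.pointwise(self.depthwise(x))
```

A rotationally symmetric kernel responds identically to rotated inputs up to grid discretization, which is what makes per-ring weight sharing attractive despite its reduced expressiveness.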
Related papers
- RIC-CNN: Rotation-Invariant Coordinate Convolutional Neural Network [56.42518353373004]
We propose a new convolutional operation, called Rotation-Invariant Coordinate Convolution (RIC-C).
By replacing all standard convolutional layers in a CNN with the corresponding RIC-C, a RIC-CNN is obtained.
RIC-CNN achieves state-of-the-art classification accuracy on the rotated MNIST test set.
arXiv Detail & Related papers (2022-11-21T19:27:02Z)
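A rough way to picture RIC-C: the sampling grid of each output pixel is rotated by that pixel's polar angle around the image center, so a global rotation of the image leaves the sampled values approximately unchanged. The sketch below implements this reading with `grid_sample`; it is unoptimized, and the details (grid size, interpolation) are our assumptions, not the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class RICConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, k=3):
        super().__init__()
        self.k = k
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch * k * k) * 0.1)

    def forward(self, x):
        B, C, H, W = x.shape
        dev = x.device
        c = (self.k - 1) / 2
        # Base sampling offsets of a k x k grid, in pixels.
        oy, ox = torch.meshgrid(torch.arange(self.k, device=dev) - c,
                                torch.arange(self.k, device=dev) - c,
                                indexing="ij")
        offs = torch.stack([ox, oy], -1).reshape(-1, 2).float()   # (k*k, 2)
        # Polar angle of every pixel w.r.t. the image center.
        ys, xs = torch.meshgrid(torch.arange(H, device=dev).float(),
                                torch.arange(W, device=dev).float(),
                                indexing="ij")
        theta = torch.atan2(ys - (H - 1) / 2, xs - (W - 1) / 2)   # (H, W)
        cos, sin = torch.cos(theta), torch.sin(theta)
        rot = torch.stack([torch.stack([cos, -sin], -1),
                           torch.stack([sin, cos], -1)], -2)      # (H, W, 2, 2)
        # Rotate the sampling grid of each pixel by its own polar angle.
        samp = torch.einsum("hwij,kj->hwki", rot, offs)           # (H, W, k*k, 2)
        pos = torch.stack([xs, ys], -1).unsqueeze(2) + samp
        # Normalize positions to [-1, 1] for grid_sample.
        pos[..., 0] = pos[..., 0] / (W - 1) * 2 - 1
        pos[..., 1] = pos[..., 1] / (H - 1) * 2 - 1
        grid = pos.reshape(1, H, W * self.k * self.k, 2).expand(B, -1, -1, -1)
        patches = F.grid_sample(x, grid, align_corners=True)      # (B, C, H, W*k*k)
        patches = patches.reshape(B, C, H, W, self.k * self.k)
        patches = patches.permute(0, 2, 3, 1, 4).reshape(B, H, W, -1)
        return torch.einsum("bhwf,of->bohw", patches, self.weight)
```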
- Orthonormal Convolutions for the Rotation Based Iterative Gaussianization [64.44661342486434]
This paper elaborates an extension of rotation-based iterative Gaussianization (RBIG) that makes image Gaussianization possible.
For images, RBIG has been restricted to small patches or isolated pixels, because its rotations are based on principal or independent component analysis.
We present Convolutional RBIG, an extension that alleviates this issue by imposing that the rotation in RBIG is a convolution.
arXiv Detail & Related papers (2022-06-08T12:56:34Z)
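For context, one plain RBIG iteration alternates marginal Gaussianization with an orthonormal rotation; Convolutional RBIG's contribution is to constrain that rotation to be a convolution. A minimal NumPy/SciPy sketch of the unconstrained iteration (not the authors' code):

```python
import numpy as np
from scipy import stats

def rbig_iteration(X):
    """X: (n_samples, n_dims). One RBIG step: Gaussianize marginals, then rotate."""
    n = X.shape[0]
    G = np.empty_like(X, dtype=float)
    for d in range(X.shape[1]):
        # Marginal Gaussianization: empirical CDF -> standard normal quantiles.
        ranks = stats.rankdata(X[:, d]) / (n + 1)
        G[:, d] = stats.norm.ppf(ranks)
    # Orthonormal rotation from PCA; Convolutional RBIG would instead
    # constrain this matrix to act as an orthonormal convolution.
    _, _, Vt = np.linalg.svd(G - G.mean(0), full_matrices=False)
    return G @ Vt.T

X = np.random.rand(1000, 4) ** 2      # non-Gaussian toy data
for _ in range(5):
    X = rbig_iteration(X)             # iterates toward a Gaussian
```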
- ART-Point: Improving Rotation Robustness of Point Cloud Classifiers via Adversarial Rotation [89.47574181669903]
In this study, we show that the rotation robustness of point cloud classifiers can also be acquired via adversarial training.
Specifically, our proposed framework named ART-Point regards the rotation of the point cloud as an attack.
We propose a fast one-step optimization to efficiently reach the final robust model.
arXiv Detail & Related papers (2022-03-08T07:20:16Z)
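The attack view translates directly into code: treat the rotation angle as the adversarial variable and take a single signed-gradient step toward the worst-case angle. A hedged PyTorch sketch, where `model` is any point cloud classifier mapping (B, N, 3) to logits and all names are illustrative:

```python
import torch

def rot_z(theta):
    """Batch of rotation matrices about the z-axis, differentiable in theta."""
    c, s = torch.cos(theta), torch.sin(theta)
    zeros, ones = torch.zeros_like(c), torch.ones_like(c)
    return torch.stack([c, -s, zeros, s, c, zeros,
                        zeros, zeros, ones], dim=-1).reshape(-1, 3, 3)

def adversarial_rotation_step(model, points, labels, lr=0.1):
    """points: (B, N, 3). One-step optimization of the attack angle."""
    theta = torch.zeros(points.shape[0], requires_grad=True)
    loss = torch.nn.functional.cross_entropy(
        model(points @ rot_z(theta).transpose(1, 2)), labels)
    loss.backward()
    # Move the angle in the direction that *increases* the loss (the attack).
    with torch.no_grad():
        theta_adv = theta + lr * theta.grad.sign()
    return (points @ rot_z(theta_adv).transpose(1, 2)).detach()

# Training then minimizes the classification loss on these rotated clouds.
```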
- RRL: Regional Rotation Layer in Convolutional Neural Networks [2.131909135487625]
Convolutional Neural Networks (CNNs) perform very well in image classification and object detection.
This paper proposes a module that can be inserted into existing networks and directly incorporates rotation invariance into the feature extraction layers of the CNNs.
The module has no learnable parameters and does not increase the complexity of the model.
arXiv Detail & Related papers (2022-02-25T06:07:53Z)
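The summary does not spell out the canonicalization rule, so the sketch below shows one plausible parameter-free scheme in the same spirit, not necessarily the authors' exact rule: each local region is rotated by a multiple of 90 degrees so that its highest-energy quadrant lands top-left, at zero learnable parameters.

```python
import torch

def regional_rotation(x, patch=4):
    """x: (B, C, H, W) with H, W divisible by `patch`. No learnable params."""
    B, C, H, W = x.shape
    # Split the map into non-overlapping patch x patch regions.
    regions = x.reshape(B, C, H // patch, patch, W // patch, patch)
    regions = regions.permute(0, 2, 4, 1, 3, 5)         # (B, h, w, C, p, p)
    # Candidate orientations: all four 90-degree rotations of each region.
    cands = torch.stack([torch.rot90(regions, k, dims=(-2, -1))
                         for k in range(4)], dim=0)      # (4, B, h, w, C, p, p)
    half = patch // 2
    # Energy of the top-left quadrant decides the canonical orientation.
    score = cands[..., :half, :half].abs().sum(dim=(-3, -2, -1))  # (4, B, h, w)
    best = score.argmax(dim=0)                           # (B, h, w)
    idx = best[None, ..., None, None, None].expand(1, *cands.shape[1:])
    out = torch.gather(cands, 0, idx)[0]                 # (B, h, w, C, p, p)
    return out.permute(0, 3, 1, 4, 2, 5).reshape(B, C, H, W)
```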
- Improving the Sample-Complexity of Deep Classification Networks with Invariant Integration [77.99182201815763]
Leveraging prior knowledge on intraclass variance due to transformations is a powerful method to improve the sample complexity of deep neural networks.
We propose a novel monomial selection algorithm based on pruning methods that allows invariant integration to be applied to more complex problems.
We demonstrate the improved sample complexity on the Rotated-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2022-02-08T16:16:11Z)
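Invariant integration itself is easy to illustrate: average a monomial of the input over a sampled set of group elements (here, rotations), and the result is an invariant feature. The two-point monomial below is an arbitrary choice of ours; the paper's actual contribution, selecting monomials by pruning, is not reproduced here.

```python
import torch
import torchvision.transforms.functional as TF

def invariant_two_point_feature(img, p=(10, 20), q=(30, 15), n_angles=8):
    """img: (C, H, W). Group-averages the monomial img[p] * img[q]."""
    feats = []
    for k in range(n_angles):
        rot = TF.rotate(img, 360.0 * k / n_angles)           # group element
        feats.append(rot[:, p[0], p[1]] * rot[:, q[0], q[1]])  # monomial, (C,)
    return torch.stack(feats).mean(0)   # average over the group -> invariant

feature = invariant_two_point_feature(torch.rand(3, 64, 64))
```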
- Adjoint Rigid Transform Network: Task-conditioned Alignment of 3D Shapes [86.2129580231191]
Adjoint Rigid Transform (ART) Network is a neural module which can be integrated with a variety of 3D networks.
ART learns to rotate input shapes to a learned canonical orientation, which is crucial for many tasks.
We will release our code and pre-trained models for further research.
arXiv Detail & Related papers (2021-02-01T20:58:45Z)
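The canonical-alignment idea can be sketched as a small permutation-invariant network that predicts a rotation (here via the common 6D representation orthonormalized by Gram-Schmidt) and applies it to the input shape; the architecture below is illustrative only, not the authors' design.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CanonicalAligner(nn.Module):
    def __init__(self, hidden=64):
        super().__init__()
        self.mlp = nn.Sequential(nn.Linear(3, hidden), nn.ReLU(),
                                 nn.Linear(hidden, hidden))
        self.head = nn.Linear(hidden, 6)   # 6D rotation representation

    def forward(self, pts):                # pts: (B, N, 3)
        feat = self.mlp(pts).max(dim=1).values       # permutation-invariant pool
        a, b = self.head(feat).reshape(-1, 2, 3).unbind(1)
        # Gram-Schmidt: build an orthonormal frame (a valid rotation matrix).
        x = F.normalize(a, dim=-1)
        y = F.normalize(b - (x * b).sum(-1, keepdim=True) * x, dim=-1)
        z = torch.cross(x, y, dim=-1)
        R = torch.stack([x, y, z], dim=-2)            # (B, 3, 3)
        return pts @ R.transpose(1, 2), R             # shape in canonical pose
```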
- Rotation-Invariant Autoencoders for Signals on Spheres [10.406659081400354]
We study the problem of unsupervised learning of rotation-invariant representations for spherical images.
In particular, we design an autoencoder architecture consisting of $S^2$ and $SO(3)$ convolutional layers.
Experiments on multiple datasets demonstrate the usefulness of the learned representations on clustering, retrieval and classification applications.
arXiv Detail & Related papers (2020-12-08T15:15:03Z)
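$S^2$ and $SO(3)$ layers need a dedicated library, so as a self-contained stand-in the sketch below computes the classical spherical-harmonic power spectrum, a descriptor of spherical signals that is exactly rotation-invariant because the energy within each degree $l$ is preserved under rotation. This illustrates the invariance, not the paper's architecture.

```python
import numpy as np
from scipy.special import sph_harm

def sh_power_spectrum(f_grid, lmax=8):
    """f_grid: signal sampled on an equiangular (theta, phi) grid."""
    n_t, n_p = f_grid.shape
    theta = (np.arange(n_t) + 0.5) * np.pi / n_t          # colatitude
    phi = np.arange(n_p) * 2 * np.pi / n_p                # azimuth
    T, P = np.meshgrid(theta, phi, indexing="ij")
    area = np.sin(T) * (np.pi / n_t) * (2 * np.pi / n_p)  # quadrature weights
    spectrum = []
    for l in range(lmax + 1):
        power = 0.0
        for m in range(-l, l + 1):
            Y = sph_harm(m, l, P, T)   # SciPy order: sph_harm(m, l, azimuth, polar)
            c = np.sum(f_grid * np.conj(Y) * area)        # projection <f, Y_lm>
            power += np.abs(c) ** 2
        spectrum.append(power)         # summing over m makes this invariant
    return np.array(spectrum)
```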
- Rotation-Invariant Point Convolution With Multiple Equivariant Alignments [1.0152838128195467]
We show that using rotation-equivariant alignments, it is possible to make any convolutional layer rotation-invariant.
With this core layer, we design rotation-invariant architectures which improve state-of-the-art results in both object classification and semantic segmentation.
arXiv Detail & Related papers (2020-12-07T20:47:46Z)
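The alignment idea in miniature: express each neighborhood's offsets in a frame that rotates *with* the data, and whatever point convolution consumes those coordinates becomes rotation-invariant. The sketch uses a PCA frame as one possible equivariant alignment and omits the sign disambiguation a real implementation needs.

```python
import torch

def align_neighborhoods(centers, neighbors):
    """centers: (B, M, 3); neighbors: (B, M, K, 3) -> aligned offsets."""
    offsets = neighbors - centers.unsqueeze(2)           # (B, M, K, 3)
    # Covariance of each neighborhood; its eigenvectors rotate with the data.
    cov = torch.einsum("bmki,bmkj->bmij", offsets, offsets)
    _, frames = torch.linalg.eigh(cov)                   # (B, M, 3, 3)
    # Coordinates in the local frame are unchanged by global rotations
    # (up to the sign of each eigenvector, ignored here for brevity).
    return torch.einsum("bmki,bmij->bmkj", offsets, frames)
```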
- Learnable Gabor modulated complex-valued networks for orientation robustness [4.024850952459758]
Learnable Gabor Convolutional Networks (LGCNs) are parameter-efficient and offer increased model complexity.
We investigate the robustness of complex-valued convolutional weights with learned Gabor filters to enable orientation transformations.
arXiv Detail & Related papers (2020-11-23T21:22:27Z)
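One way to read "Gabor modulated": multiply a learnable kernel elementwise by a complex Gabor carrier whose orientation and frequency are themselves learned. The parameterization below is our assumption of how such a layer could look, not the authors' exact design.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaborModulatedConv2d(nn.Module):
    def __init__(self, in_ch, out_ch, k=7):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_ch, in_ch, k, k) * 0.1)
        self.theta = nn.Parameter(torch.rand(out_ch) * math.pi)  # orientation
        self.freq = nn.Parameter(torch.ones(out_ch))             # frequency
        self.sigma = k / 4.0
        c = (k - 1) / 2
        ys, xs = torch.meshgrid(torch.arange(k).float() - c,
                                torch.arange(k).float() - c, indexing="ij")
        self.register_buffer("xs", xs)
        self.register_buffer("ys", ys)

    def forward(self, x):
        # Rotate coordinates per output filter, then build the Gabor carrier.
        u = (torch.cos(self.theta)[:, None, None] * self.xs +
             torch.sin(self.theta)[:, None, None] * self.ys)     # (O, k, k)
        env = torch.exp(-(self.xs ** 2 + self.ys ** 2) / (2 * self.sigma ** 2))
        carrier = torch.exp(1j * self.freq[:, None, None] * u) * env
        w = self.weight * carrier.unsqueeze(1)                   # complex kernel
        pad = self.xs.shape[0] // 2
        # Real-valued input: convolve with real and imaginary parts separately.
        return torch.cat([F.conv2d(x, w.real, padding=pad),
                          F.conv2d(x, w.imag, padding=pad)], dim=1)
```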
- Rotated Binary Neural Network [138.89237044931937]
Binary Neural Networks (BNNs) excel at reducing the complexity of deep neural networks.
One of the major impediments is the large quantization error between the full-precision weight vector and its binary vector.
We introduce a Rotated Binary Neural Network (RBNN) which considers the angle alignment between the full-precision weight vector and its binarized version.
arXiv Detail & Related papers (2020-09-28T04:22:26Z)
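The angle-alignment idea can be made concrete with a toy experiment: the quantization error between w and its binarization grows with the angle between them, and rotating w before binarizing shrinks that angle. RBNN learns an efficient bi-rotation during training; this sketch merely constructs the one-shot plane rotation that maps w onto its binary direction, to make the effect visible.

```python
import torch

def binarize(w):
    return w.abs().mean() * w.sign()   # scaled sign, as in XNOR-style BNNs

def plane_rotation_align(w):
    """Rotate w (1D tensor) onto the direction of its binarization."""
    a = w / w.norm()
    b = binarize(w)
    b = b / b.norm()
    # Rotation in the plane span{a, b} such that R a = b (assumes a != b).
    cos = a @ b
    sin = torch.sqrt(torch.clamp(1 - cos ** 2, min=0))
    u = a
    v = b - cos * a
    v = v / v.norm()
    R = (torch.eye(w.numel()) +
         sin * (torch.outer(v, u) - torch.outer(u, v)) +
         (cos - 1) * (torch.outer(u, u) + torch.outer(v, v)))
    return R @ w

w = torch.randn(64)
before = torch.nn.functional.cosine_similarity(w, binarize(w), dim=0)
w_rot = plane_rotation_align(w)
after = torch.nn.functional.cosine_similarity(w_rot, binarize(w_rot), dim=0)
print(before.item(), after.item())     # the angle shrinks after rotation
```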