Revisiting the Continuity of Rotation Representations in Neural Networks
- URL: http://arxiv.org/abs/2006.06234v2
- Date: Fri, 12 Jun 2020 04:10:52 GMT
- Title: Revisiting the Continuity of Rotation Representations in Neural Networks
- Authors: Sitao Xiang, Hao Li
- Abstract summary: We analyze certain pathological behavior of Euler angles and unit quaternions encountered in previous works related to rotation representation in neural networks.
We show that this behavior is inherent in the topological property of the problem itself and is not caused by unsuitable network architectures or training procedures.
- Score: 14.63787408331962
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we provide some careful analysis of certain pathological
behavior of Euler angles and unit quaternions encountered in previous works
related to rotation representation in neural networks. In particular, we show
that for certain problems, these two representations will provably produce
completely wrong results for some inputs, and that this behavior is inherent in
the topological property of the problem itself and is not caused by unsuitable
network architectures or training procedures. We further show that previously
proposed embeddings of $\mathrm{SO}(3)$ into higher dimensional Euclidean
spaces aimed at fixing this behavior are not universally effective, due to
possible symmetry in the input causing changes to the topology of the input
space. We propose an ensemble trick as an alternative solution.
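For context, the "previously proposed embeddings" referenced in the abstract include the 6D Gram-Schmidt representation of Zhou et al. (CVPR 2019), whose continuity this paper revisits. Below is a minimal numpy sketch of that embedding (the function name is illustrative, not from the paper): a quaternion q and its negation -q encode the same rotation, so decoding rotations to single quaternions must be discontinuous somewhere, while the 6D map has no such ambiguity. The paper's point is that even such embeddings can fail when input symmetry changes the topology of the input space, motivating its ensemble trick (not sketched here).

```python
import numpy as np

def rotation_from_6d(x):
    """Orthonormalize a 6D vector into a rotation matrix (Gram-Schmidt).

    Sketch of the continuous SO(3) embedding discussed in the abstract
    (cf. Zhou et al., CVPR 2019): the first two columns are predicted
    freely and orthonormalized; the third is their cross product.
    """
    a1, a2 = x[:3], x[3:]
    b1 = a1 / np.linalg.norm(a1)
    b2 = a2 - (b1 @ a2) * b1
    b2 /= np.linalg.norm(b2)
    b3 = np.cross(b1, b2)
    return np.stack([b1, b2, b3], axis=1)  # columns form a right-handed frame

# Any 6D output decodes to a valid rotation, and nearby inputs decode to
# nearby rotations -- unlike Euler angles or quaternions, whose decoding
# is discontinuous somewhere on SO(3).
R = rotation_from_6d(np.random.randn(6))
assert np.allclose(R.T @ R, np.eye(3)) and np.isclose(np.linalg.det(R), 1.0)
```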
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
We introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
We also propose deploying topological densification when fine-tuning relative representations: a topological regularization loss that encourages clustering within classes (a minimal sketch of the relative-representation construction follows this entry).
arXiv Detail & Related papers (2024-09-17T08:09:22Z)
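As background for the entry above, here is a minimal sketch, assuming the standard relative-representation construction (each embedding re-expressed as cosine similarities to a fixed set of anchor samples). The normalization procedure and topological densification loss the entry describes are specific to that paper and are not reproduced; the embeddings and anchors below are random placeholders.

```python
import numpy as np

def relative_representation(z, anchors):
    """Express embeddings as cosine similarities to fixed anchors.

    Basic relative-representation construction: row i of the result is
    sample i described relative to the anchor set, which is what makes
    independently trained latent spaces comparable for model stitching.
    """
    z = z / np.linalg.norm(z, axis=-1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=-1, keepdims=True)
    return z @ a.T  # shape: (n_samples, n_anchors)

embeddings = np.random.randn(8, 64)  # latent codes from some encoder
anchors = np.random.randn(10, 64)    # a fixed anchor set
rel = relative_representation(embeddings, anchors)
print(rel.shape)  # (8, 10): one similarity profile per sample
```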
- Non Commutative Convolutional Signal Models in Neural Networks: Stability to Small Deformations [111.27636893711055]
We study the filtering and stability properties of non commutative convolutional filters.
Our results have direct implications for group neural networks, multigraph neural networks and quaternion neural networks.
arXiv Detail & Related papers (2023-10-05T20:27:22Z)
- Learning Linear Causal Representations from Interventions under General Nonlinear Mixing [52.66151568785088]
We prove strong identifiability results given unknown single-node interventions without access to the intervention targets.
This is the first instance of causal identifiability from non-paired interventions for deep neural network embeddings.
arXiv Detail & Related papers (2023-06-04T02:32:12Z)
- From Tempered to Benign Overfitting in ReLU Neural Networks [41.271773069796126]
Overparameterized neural networks (NNs) are observed to generalize well even when trained to perfectly fit noisy data.
Recently, it was conjectured and empirically observed that the behavior of NNs is often better described as "tempered overfitting".
arXiv Detail & Related papers (2023-05-24T13:36:06Z)
- Deep neural networks can stably solve high-dimensional, noisy, non-linear inverse problems [2.6651200086513107]
We study the problem of reconstructing solutions of inverse problems when only noisy measurements are available.
For the inverse operator, we demonstrate that there exists a neural network which is a robust-to-noise approximation of the operator.
arXiv Detail & Related papers (2022-06-02T08:51:46Z)
- An artificial neural network approach to bifurcating phenomena in computational fluid dynamics [0.0]
We discuss the POD-NN approach for dealing with non-smooth solution sets of nonlinear parametrized PDEs.
We propose a reduced manifold-based bifurcation diagram for non-intrusive recovery of the evolution of critical points (a minimal POD sketch follows this entry).
arXiv Detail & Related papers (2021-09-22T14:42:36Z)
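For the entry above, here is a minimal sketch of the proper orthogonal decomposition (POD) step underlying a POD-NN pipeline. The snapshot data below is synthetic and all names are illustrative; in the cited paper the snapshots are PDE solutions at sampled parameter values, and a neural network performs the parameter-to-coefficient regression that a polynomial least-squares fit stands in for here.

```python
import numpy as np

rng = np.random.default_rng(0)
params = rng.uniform(0.0, 1.0, size=(100, 1))  # sampled PDE parameters
# Synthetic "solution snapshots", one row per parameter sample.
snapshots = np.sin(np.outer(params[:, 0], np.linspace(0.0, 3.0, 200)))

# Proper orthogonal decomposition: SVD of the snapshot matrix.
U, S, Vt = np.linalg.svd(snapshots.T, full_matrices=False)
r = 5                            # reduced dimension
basis = U[:, :r]                 # POD modes
coeffs = snapshots @ basis       # reduced coordinates of each snapshot

# A network would be trained to map params -> coeffs; a cubic
# least-squares fit stands in for it in this sketch.
features = np.hstack([params**k for k in range(4)])
W, *_ = np.linalg.lstsq(features, coeffs, rcond=None)
recon = (features @ W) @ basis.T  # non-intrusive reconstruction
print(np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots))
```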
- Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non commutative convolutional neural networks.
We show that non commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
- A neural anisotropic view of underspecification in deep learning [60.119023683371736]
We show that the way neural networks handle the underspecification of problems is highly dependent on the data representation.
Our results highlight that understanding the architectural inductive bias in deep learning is fundamental to address the fairness, robustness, and generalization of these systems.
arXiv Detail & Related papers (2021-04-29T14:31:09Z)
- Going beyond p-convolutions to learn grayscale morphological operators [64.38361575778237]
We present two new morphological layers based on the same principle as the p-convolutional layer.
arXiv Detail & Related papers (2021-02-19T17:22:16Z)
- Deep neural network surrogates for non-smooth quantities of interest in shape uncertainty quantification [0.0]
We focus on an elliptic interface problem and a Helmholtz transmission problem.
Point values of the solution in the physical domain depend in general non-smoothly on the high-dimensional parameter.
We build surrogates for point evaluation using deep neural networks.
arXiv Detail & Related papers (2021-01-18T12:02:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences of its use.