Rotation-Invariant Gait Identification with Quaternion Convolutional
Neural Networks
- URL: http://arxiv.org/abs/2008.07393v1
- Date: Tue, 4 Aug 2020 23:22:12 GMT
- Title: Rotation-Invariant Gait Identification with Quaternion Convolutional
Neural Networks
- Authors: Bowen Jing, Vinay Prabhu, Angela Gu, John Whaley
- Abstract summary: We introduce Quaternion CNN, a network architecture which is intrinsically layer-wise equivariant and globally invariant under 3D rotations.
We show empirically that this network indeed significantly outperforms a traditional CNN in a multi-user rotation-invariant gait classification setting.
- Score: 7.638280076041963
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A desirable property of accelerometric gait-based identification systems is robustness to new device orientations presented by users during testing but unseen during the training phase. However, traditional convolutional neural networks (CNNs) used in these systems compensate poorly for such transformations. In this paper, we target this problem by introducing Quaternion CNN, a network architecture which is intrinsically layer-wise equivariant and globally invariant under 3D rotations of an array of input vectors. We show empirically that this network indeed significantly outperforms a traditional CNN in a multi-user rotation-invariant gait classification setting. Lastly, we demonstrate how the kernels learned by this QCNN can also be visualized as basis-independent but origin- and chirality-dependent trajectory fragments in Euclidean space, thus yielding a novel mode of feature visualization and extraction.
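The following is a minimal NumPy sketch (not the authors' implementation) of the property the abstract claims: if every 3D accelerometer vector in a window is rotated by the same quaternion, the feature maps of a convolution over vector-valued inputs rotate accordingly (layer-wise equivariance), and a norm-based readout is unchanged (global invariance). The paper's QCNN uses quaternion-valued kernels; here quaternion algebra is used only to apply the rotation, the kernel weights are real so the equivariance is easy to verify, and all names and sizes are illustrative.

```python
import numpy as np

def qmul(a, b):
    """Hamilton product of quaternions stored as (..., 4) arrays (w, x, y, z)."""
    w1, x1, y1, z1 = a[..., 0], a[..., 1], a[..., 2], a[..., 3]
    w2, x2, y2, z2 = b[..., 0], b[..., 1], b[..., 2], b[..., 3]
    return np.stack([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ], axis=-1)

def rotate(vecs, q):
    """Rotate 3D vectors (..., 3) by the unit quaternion q via q * v * conj(q)."""
    pure = np.concatenate([np.zeros(vecs.shape[:-1] + (1,)), vecs], axis=-1)
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    return qmul(qmul(q, pure), q_conj)[..., 1:]

def equivariant_conv(x, weights):
    """Temporal convolution with real-valued kernels over a (T, 3) vector signal.
    Each output is a real-weighted sum of input vectors, so a global rotation of
    the input commutes with the layer (hypothetical layer, not the paper's
    quaternion-kernel parameterization)."""
    c_out, k = weights.shape
    t_out = x.shape[0] - k + 1
    out = np.zeros((c_out, t_out, 3))
    for c in range(c_out):
        for t in range(t_out):
            out[c, t] = (weights[c, :, None] * x[t:t + k]).sum(axis=0)
    return out

def invariant_readout(features):
    """Rotation-invariant readout: per-position norms of the equivariant features."""
    return np.linalg.norm(features, axis=-1)

rng = np.random.default_rng(0)
window = rng.normal(size=(128, 3))        # toy accelerometer window
weights = rng.normal(size=(8, 5))         # 8 channels, kernel size 5

q = rng.normal(size=4)
q /= np.linalg.norm(q)                    # random unit quaternion = random 3D rotation
rotated_window = rotate(window, q)

f_orig = invariant_readout(equivariant_conv(window, weights))
f_rot = invariant_readout(equivariant_conv(rotated_window, weights))
print(np.allclose(f_orig, f_rot))         # True: the readout is rotation-invariant
```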
Related papers
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs)
Our approach develops within a recently introduced framework aimed at learning neural-network-based surrogate models of Lattice Boltzmann collision operators.
Our work opens towards practical utilization of machine learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- The role of data embedding in equivariant quantum convolutional neural networks [2.255961793913651]
We investigate the role of classical-to-quantum embedding on the performance of equivariant quantum neural networks (EQNNs)
We numerically compare the classification accuracy of EQCNNs with three different basis-permuted amplitude embeddings to the one obtained from a non-equivariant quantum convolutional neural network (QCNN)
arXiv Detail & Related papers (2023-12-20T18:25:15Z)
- Permutation Equivariant Neural Functionals [92.0667671999604]
This work studies the design of neural networks that can process the weights or gradients of other neural networks.
We focus on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order.
In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks.
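As a concrete illustration of the permutation symmetry mentioned above (a NumPy sketch with made-up dimensions, not code from the paper): relabeling the hidden neurons of a feedforward network permutes rows and columns of its weight matrices without changing the function it computes, which is exactly the symmetry a neural functional should respect.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small feedforward network y = W2 @ relu(W1 @ x + b1) + b2.
d_in, d_hidden, d_out = 4, 6, 3
W1 = rng.normal(size=(d_hidden, d_in)); b1 = rng.normal(size=d_hidden)
W2 = rng.normal(size=(d_out, d_hidden)); b2 = rng.normal(size=d_out)

def forward(W1, b1, W2, b2, x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

# Relabel the hidden neurons: permute rows of (W1, b1) and columns of W2.
perm = rng.permutation(d_hidden)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

x = rng.normal(size=d_in)
print(np.allclose(forward(W1, b1, W2, b2, x),
                  forward(W1p, b1p, W2p, b2, x)))   # True: same function, permuted weights
```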
arXiv Detail & Related papers (2023-02-27T18:52:38Z)
- Applications of Lattice Gauge Equivariant Neural Networks [0.0]
Lattice Gauge Equivariant Convolutional Neural Networks (L-CNNs) can generalize better to differently sized lattices than traditional neural networks.
We present our progress on possible applications of L-CNNs to Wilson flow or continuous normalizing flow.
arXiv Detail & Related papers (2022-12-01T19:32:42Z)
- Interrelation of equivariant Gaussian processes and convolutional neural networks [77.34726150561087]
There is currently a promising new trend in machine learning (ML) based on the relationship between neural networks (NNs) and Gaussian processes (GPs).
In this work we establish a relationship between the many-channel limit of CNNs that are equivariant with respect to the two-dimensional Euclidean group, with vector-valued neuron activations, and the corresponding independently introduced equivariant Gaussian processes (GPs).
arXiv Detail & Related papers (2022-09-17T17:02:35Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Implicit Equivariance in Convolutional Networks [1.911678487931003]
Implicitly Equivariant Networks (IEN) induce equivariance in the different layers of a standard CNN model.
We show IEN outperforms the state-of-the-art rotation equivariant tracking method while providing faster inference speed.
arXiv Detail & Related papers (2021-11-28T14:44:17Z)
- Designing Rotationally Invariant Neural Networks from PDEs and Variational Methods [8.660429288575367]
We investigate how diffusion and variational models achieve rotation invariance and transfer these ideas to neural networks.
We propose activation functions which couple network channels by combining information from several oriented filters.
Our findings help to translate diffusion and variational models into mathematically well-founded network architectures, and provide novel concepts for model-based CNN design.
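A minimal sketch of the channel-coupling idea described above, restricted to the 90-degree rotation subgroup and using a random filter bank rather than the paper's PDE-derived filters: correlating an image with all four rotations of a filter and coupling the orientation channels through their Euclidean norm yields a response map that simply co-rotates with the input, so pooled statistics of it are rotation-invariant.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)

# Toy image and a single 3x3 filter; the "orientation channels" are its four
# 90-degree rotations (illustrative example, not the paper's construction).
image = rng.normal(size=(32, 32))
base = rng.normal(size=(3, 3))
bank = [np.rot90(base, k) for k in range(4)]

def coupled_response(img):
    """Couple the orientation channels by their Euclidean norm at each pixel."""
    responses = np.stack([ndimage.correlate(img, k, mode='wrap') for k in bank])
    return np.sqrt((responses ** 2).sum(axis=0))

# Rotating the input only rotates the coupled response map: the orientation
# channels are permuted among themselves and the norm does not notice.
r_orig = coupled_response(image)
r_rot = coupled_response(np.rot90(image))
print(np.allclose(np.rot90(r_orig), r_rot))   # True
```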
arXiv Detail & Related papers (2021-08-31T17:34:40Z)
- Spherical Transformer: Adapting Spherical Signal to CNNs [53.18482213611481]
Spherical Transformer can transform spherical signals into vectors that can be directly processed by standard CNNs.
We evaluate our approach on the tasks of spherical MNIST recognition, 3D object classification and omnidirectional image semantic segmentation.
arXiv Detail & Related papers (2021-01-11T12:33:16Z)
- Lattice gauge equivariant convolutional neural networks [0.0]
We propose Lattice gauge equivariant Convolutional Neural Networks (L-CNNs) for generic machine learning applications.
We show that L-CNNs can learn and generalize gauge invariant quantities that traditional convolutional neural networks are incapable of finding.
arXiv Detail & Related papers (2020-12-23T19:00:01Z)
- Volumetric Transformer Networks [88.85542905676712]
We introduce a learnable module, the volumetric transformer network (VTN)
VTN predicts channel-wise warping fields so as to reconfigure intermediate CNN features spatially and channel-wisely.
Our experiments show that VTN consistently boosts the features' representation power and consequently the networks' accuracy on fine-grained image recognition and instance-level image retrieval.
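A hypothetical, heavily simplified PyTorch sketch of the mechanism described above (module name, layer sizes, and the single offset-predicting convolution are invented, not the authors' VTN): a small convolution predicts a separate 2D warping field for each feature channel, and every channel is then resampled with its own field.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ChannelwiseWarp(nn.Module):
    """Predict one warping field per channel and resample each channel with it
    (illustrative sketch only, not the authors' implementation)."""

    def __init__(self, channels: int):
        super().__init__()
        # Predict 2 offset maps (dx, dy) for every feature channel.
        self.offset = nn.Conv2d(channels, 2 * channels, kernel_size=3, padding=1)
        nn.init.zeros_(self.offset.weight)   # start from the identity warp
        nn.init.zeros_(self.offset.bias)

    def forward(self, x):
        n, c, h, w = x.shape
        offsets = self.offset(x).view(n, c, 2, h, w)        # per-channel (dx, dy)
        # Base sampling grid in normalized [-1, 1] coordinates, (x, y) order.
        ys, xs = torch.meshgrid(
            torch.linspace(-1, 1, h, device=x.device),
            torch.linspace(-1, 1, w, device=x.device),
            indexing="ij",
        )
        base = torch.stack((xs, ys), dim=-1)                # (h, w, 2)
        grid = base + offsets.permute(0, 1, 3, 4, 2)        # (n, c, h, w, 2)
        # Warp every channel independently by folding channels into the batch.
        warped = F.grid_sample(
            x.reshape(n * c, 1, h, w),
            grid.reshape(n * c, h, w, 2),
            mode="bilinear", padding_mode="border", align_corners=True,
        )
        return warped.view(n, c, h, w)

# Example: reconfigure an intermediate 64-channel feature map.
feat = torch.randn(2, 64, 16, 16)
print(ChannelwiseWarp(64)(feat).shape)   # torch.Size([2, 64, 16, 16])
```

Zero-initializing the offset convolution makes the module an identity mapping at the start of training, a common design choice for learned warping layers.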
arXiv Detail & Related papers (2020-07-18T14:00:12Z)