Generalizing Complex/Hyper-complex Convolutions to Vector Map
Convolutions
- URL: http://arxiv.org/abs/2009.04083v1
- Date: Wed, 9 Sep 2020 03:00:03 GMT
- Title: Generalizing Complex/Hyper-complex Convolutions to Vector Map
Convolutions
- Authors: Chase J Gaudet and Anthony S Maida
- Abstract summary: We show that complex and hypercomplex valued neural networks offer improvements over their real-valued counterparts.
We introduce novel vector map convolutions which capture both of these properties.
We perform three experiments to show that these novel vector map convolutions seem to capture all the benefits of complex and hyper-complex networks.
- Score: 1.370633147306388
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We show that the core reasons that complex and hypercomplex valued neural
networks offer improvements over their real-valued counterparts are the
weight-sharing mechanism and the treatment of multidimensional data as a single entity. Their
algebra linearly combines the dimensions, making each dimension related to the
others. However, both are constrained to a set number of dimensions, two for
complex and four for quaternions. Here we introduce novel vector map
convolutions which capture both of these properties provided by
complex/hypercomplex convolutions, while dropping the unnatural dimensionality
constraints they impose. This is achieved by introducing a system that mimics
the unique linear combination of input dimensions, such as the Hamilton product
for quaternions. We perform three experiments to show that these novel vector
map convolutions seem to capture all the benefits of complex and hyper-complex
networks, such as their ability to capture internal latent relations, while
avoiding the dimensionality restriction.
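To make the weight-sharing idea concrete, here is a minimal NumPy sketch: the first function realizes the Hamilton product's linear combination of the four quaternion dimensions as a real 4x4 matrix, and the second illustrates one way such a combination could be mimicked for arbitrary n dimensions via circular shifts. The function names and the circulant mixing rule are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def hamilton_matrix(w):
    """Real 4x4 matrix realizing left-multiplication by the
    quaternion w = (a, b, c, d): every output dimension is a signed
    linear combination of all four input dimensions."""
    a, b, c, d = w
    return np.array([
        [a, -b, -c, -d],
        [b,  a, -d,  c],
        [c,  d,  a, -b],
        [d, -c,  b,  a],
    ])

def vector_map_mix(weights, x):
    """Hypothetical n-dimensional analogue: the same weight vector is
    shared across all output dimensions via circular shifts, so each
    dimension is a linear combination of all the others (a sketch of
    the 'vector map' idea, not the paper's exact rule)."""
    n = len(weights)
    M = np.stack([np.roll(weights, i) for i in range(n)])  # circulant mixing
    return M @ x
```

For example, `hamilton_matrix((0, 1, 0, 0))` applied to `(0, 0, 1, 0)` reproduces the quaternion identity i * j = k.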
Related papers
- Multifractal dimensions for orthogonal-to-unitary crossover ensemble [1.0793830805346494]
We show that finite-size versions of multifractal dimensions converge to unity logarithmically slowly as the system size $N$ increases.
We apply our results to analyze the multifractal dimensions in a quantum kicked rotor, a Sinai billiard system, and a correlated spin chain model in a random field.
arXiv Detail & Related papers (2023-10-05T13:22:43Z)
- Generalized Volume Complexity in Gauss-Bonnet Gravity: Constraints and Phase Transitions [5.708951835302518]
It has been proposed that quantum complexity is dual to the volume of the extremal surface, the action of the Wheeler-DeWitt patch, and the spacetime volume of the patch.
A generalized volume-complexity observable was formulated as an equivalently good candidate for the dual holographic complexity.
We demonstrate that this proposal guarantees the linear growth of the generalized volume at late times, regardless of the coupling parameters for four-dimensional Gauss-Bonnet gravity.
arXiv Detail & Related papers (2023-07-24T05:26:39Z)
- On the Matrix Form of the Quaternion Fourier Transform and Quaternion Convolution [6.635903943457569]
We study matrix forms of quaternionic versions of the Fourier Transform and Convolution operations.
Quaternions offer a powerful representation, but their use is complicated, foremost by the non-commutativity of quaternion multiplication.
arXiv Detail & Related papers (2023-07-04T17:28:58Z)
- Polyhedral Complex Extraction from ReLU Networks using Edge Subdivision [0.0]
A neural network consists of piecewise affine building blocks, such as fully-connected layers and ReLU activations.
The resulting polyhedral complex has been previously studied to characterize theoretical properties of neural networks.
We propose to subdivide the regions via intersections with hyperplanes induced by each neuron.
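The subdivision step can be illustrated with a short sketch, assuming a simple edge/hyperplane intersection test (the function name and interface are hypothetical, not the paper's implementation):

```python
import numpy as np

def subdivide_edge(p, q, w, b):
    """Split segment pq at its intersection with the hyperplane
    w.x + b = 0 induced by one neuron, as in edge-subdivision
    extraction of a ReLU network's polyhedral complex (illustrative
    sketch). Returns the intersection point, or None if the edge
    does not cross the hyperplane."""
    fp, fq = w @ p + b, w @ q + b
    if fp * fq >= 0:           # both endpoints on the same side (or touching)
        return None
    t = fp / (fp - fq)         # linear interpolation parameter in (0, 1)
    return p + t * (q - p)
```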
arXiv Detail & Related papers (2023-06-12T16:17:04Z)
- Oracle-Preserving Latent Flows [58.720142291102135]
We develop a methodology for the simultaneous discovery of multiple nontrivial continuous symmetries across an entire labelled dataset.
The symmetry transformations and the corresponding generators are modeled with fully connected neural networks trained with a specially constructed loss function.
The two new elements in this work are the use of a reduced-dimensionality latent space and the generalization to transformations invariant with respect to high-dimensional oracles.
arXiv Detail & Related papers (2023-02-02T00:13:32Z)
- Neural Wavelet-domain Diffusion for 3D Shape Generation, Inversion, and Manipulation [54.09274684734721]
We present a new approach for 3D shape generation, inversion, and manipulation, through a direct generative modeling on a continuous implicit representation in wavelet domain.
Specifically, we propose a compact wavelet representation with a pair of coarse and detail coefficient volumes to implicitly represent 3D shapes via truncated signed distance functions and multi-scale biorthogonal wavelets.
An encoder network can be jointly trained to learn a latent space for inverting shapes, enabling a rich variety of whole-shape and region-aware shape manipulations.
arXiv Detail & Related papers (2023-02-01T02:47:53Z)
- Exploring the Adjugate Matrix Approach to Quaternion Pose Extraction [0.0]
Quaternions are important for a wide variety of rotation-related problems in computer graphics, machine vision, and robotics.
We study the nontrivial geometry of the relationship between quaternions and rotation matrices by exploiting the adjugate matrix of the characteristic equation of a related eigenvalue problem.
We find an exact solution to the 3D orthographic least squares pose extraction problem, and apply it successfully also to the perspective pose extraction problem with results that improve on existing methods.
arXiv Detail & Related papers (2022-05-17T23:20:55Z)
- A Scalable Combinatorial Solver for Elastic Geometrically Consistent 3D Shape Matching [69.14632473279651]
We present a scalable algorithm for globally optimizing over the space of geometrically consistent mappings between 3D shapes.
We propose a novel primal problem coupled with a Lagrange dual problem that is several orders of magnitude faster than previous solvers.
arXiv Detail & Related papers (2022-04-27T09:47:47Z)
- Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z)
- NeuroMorph: Unsupervised Shape Interpolation and Correspondence in One Go [109.88509362837475]
We present NeuroMorph, a new neural network architecture that takes as input two 3D shapes.
NeuroMorph produces smooth and point-to-point correspondences between them.
It works well for a large variety of input shapes, including non-isometric pairs from different object categories.
arXiv Detail & Related papers (2021-06-17T12:25:44Z)
- Beyond Fully-Connected Layers with Quaternions: Parameterization of Hypercomplex Multiplications with $1/n$ Parameters [71.09633069060342]
We propose parameterizing hypercomplex multiplications, allowing models to learn multiplication rules from data regardless of whether such rules are predefined.
Our method not only subsumes the Hamilton product, but also learns to operate on any arbitrary nD hypercomplex space.
arXiv Detail & Related papers (2021-02-17T06:16:58Z)
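The parameterization summarized above is commonly expressed as a sum of Kronecker products between small "rule" matrices and free parameter blocks; the sketch below follows that form, with the function name and tensor shapes as illustrative assumptions rather than the paper's code:

```python
import numpy as np

def phm_weight(A, B):
    """Build a (n*k x n*m) weight matrix as a sum of Kronecker
    products: A holds n learned (n x n) multiplication-rule matrices,
    B holds n free (k x m) parameter blocks, so the dense weight uses
    roughly 1/n of the parameters of an unconstrained matrix."""
    return sum(np.kron(A[i], B[i]) for i in range(A.shape[0]))
```

With n = 2 and the rule matrices set to the identity and the 90-degree rotation, the resulting weight reproduces the complex-multiplication matrix [[a, -b], [b, a]], showing how a predefined algebra is one point in the learnable family.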
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.