A Quaternion-Valued Variational Autoencoder
- URL: http://arxiv.org/abs/2010.11647v2
- Date: Thu, 22 Apr 2021 16:23:23 GMT
- Title: A Quaternion-Valued Variational Autoencoder
- Authors: Eleonora Grassucci, Danilo Comminiello, Aurelio Uncini
- Abstract summary: Variational autoencoders (VAEs) have proved their ability to model a generative process by learning a latent representation of the input.
We propose a novel VAE defined in the quaternion domain, which exploits the properties of quaternion algebra to improve performance.
- Score: 15.153617649974263
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep probabilistic generative models have achieved incredible success in many
fields of application. Among such models, variational autoencoders (VAEs) have
proved their ability to model a generative process by learning a latent
representation of the input. In this paper, we propose a novel VAE defined in
the quaternion domain, which exploits the properties of quaternion algebra to
improve performance while significantly reducing the number of parameters
required by the network. The success of the proposed quaternion VAE with
respect to traditional VAEs relies on the ability to leverage the internal
relations between quaternion-valued input features and on the properties of
second-order statistics, which allow the latent variables to be defined in the
augmented quaternion domain. In order to show the advantages of such
properties, we define a plain convolutional VAE in the quaternion domain and we
evaluate its performance with respect to its real-valued counterpart on the
CelebA face dataset.
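To make the core operation concrete, below is a minimal sketch (in PyTorch, not the authors' implementation) of a quaternion convolution built on the Hamilton product, the weight-sharing mechanism that quaternion layers exploit; the channel layout, the initialization, and the example of encoding an RGB image as a pure quaternion are illustrative assumptions.

```python
# Minimal sketch (not the authors' code) of a quaternion convolution built on the
# Hamilton product, the core operation a quaternion VAE can exploit to share
# parameters across the four components of a quaternion-valued feature map.
import torch
import torch.nn as nn
import torch.nn.functional as F

class QuaternionConv2d(nn.Module):
    """Quaternion 2D convolution: channels are split into the four quaternion
    components (r, i, j, k) and mixed with a single shared weight quadruple,
    which is why the parameter count drops to ~1/4 of an equivalent real conv."""
    def __init__(self, in_q_channels, out_q_channels, kernel_size, padding=0):
        super().__init__()
        shape = (out_q_channels, in_q_channels, kernel_size, kernel_size)
        self.w_r = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_i = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_j = nn.Parameter(torch.randn(shape) * 0.05)
        self.w_k = nn.Parameter(torch.randn(shape) * 0.05)
        self.padding = padding

    def forward(self, x):
        # x: (batch, 4 * in_q_channels, H, W), laid out as [r | i | j | k]
        r, i, j, k = torch.chunk(x, 4, dim=1)
        conv = lambda inp, w: F.conv2d(inp, w, padding=self.padding)
        # Hamilton product W ⊗ x, expanded component by component
        out_r = conv(r, self.w_r) - conv(i, self.w_i) - conv(j, self.w_j) - conv(k, self.w_k)
        out_i = conv(i, self.w_r) + conv(r, self.w_i) + conv(k, self.w_j) - conv(j, self.w_k)
        out_j = conv(j, self.w_r) - conv(k, self.w_i) + conv(r, self.w_j) + conv(i, self.w_k)
        out_k = conv(k, self.w_r) + conv(j, self.w_i) - conv(i, self.w_j) + conv(r, self.w_k)
        return torch.cat([out_r, out_i, out_j, out_k], dim=1)

# Example: an RGB image padded with a zero channel gives one quaternion per pixel.
x = torch.randn(8, 4, 64, 64)            # batch of 8 "quaternion" images
layer = QuaternionConv2d(1, 16, 3, padding=1)
print(layer(x).shape)                     # torch.Size([8, 64, 64, 64])
```

Stacking such layers in the encoder and decoder is what allows a quaternion VAE to match a real-valued architecture of the same channel widths with roughly a quarter of the convolutional parameters.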
Related papers
- Statistical Analysis of the Impact of Quaternion Components in Convolutional Neural Networks [0.5755004576310334]
This paper presents a statistical analysis carried out on experimental data to compare the performance of existing components for the image classification problem.
We introduce a novel Fully Quaternion ReLU activation function, which exploits the unique properties of quaternion algebra to improve model performance.
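The exact form of the Fully Quaternion ReLU is not given in this summary; purely as a point of reference, the sketch below contrasts two activation styles commonly discussed for quaternion networks, a componentwise "split" ReLU and a whole-quaternion gate (the gating rule shown is hypothetical, not the paper's definition).

```python
# Illustrative only: two common styles of quaternion activation. Neither is
# claimed to be the paper's Fully Quaternion ReLU.
import numpy as np

def split_relu(q):
    """'Split' activation: ReLU applied independently to each of the four
    components; ignores the algebraic structure of the quaternion."""
    return np.maximum(q, 0.0)

def gated_relu(q):
    """A 'whole-quaternion' style: keep the full quaternion when its real
    part is positive, otherwise zero all four components (hypothetical rule)."""
    r = q[..., 0:1]                        # real component
    return np.where(r > 0, q, 0.0)

q = np.array([[0.5, -1.0, 2.0, -0.3],     # quaternions stored as (r, i, j, k)
              [-0.2, 0.7, -0.4, 1.1]])
print(split_relu(q))   # componentwise clipping
print(gated_relu(q))   # second quaternion zeroed entirely (negative real part)
```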
arXiv Detail & Related papers (2024-08-29T19:13:20Z) - State-Free Inference of State-Space Models: The Transfer Function Approach [132.83348321603205]
State-free inference does not incur any significant memory or computational cost with an increase in state size.
We achieve this using properties of the proposed frequency domain transfer function parametrization.
We report improved perplexity in language modeling over a long convolutional Hyena baseline.
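The mechanism alluded to above can be sketched generically: a rational transfer function is evaluated on a frequency grid and applied by FFT, so no recurrent state is ever stored. The NumPy snippet below is a loose illustration of that idea, not the paper's parametrization.

```python
# Generic sketch of "state-free" filtering with a rational transfer function
# H(z) = B(z^-1) / A(z^-1): the filter is applied entirely in the frequency
# domain, so the recurrent state of an equivalent state-space model is never
# materialized. Loose illustration only; not the paper's parametrization.
import numpy as np

def filter_via_transfer_function(u, b, a):
    n = len(u)
    z_inv = np.exp(-2j * np.pi * np.arange(n) / n)    # z^-1 on the DFT grid
    H = np.polyval(b[::-1], z_inv) / np.polyval(a[::-1], z_inv)
    # Multiply in frequency; this realizes a circular (wrap-around) convolution.
    return np.fft.ifft(np.fft.fft(u) * H).real

u = np.random.randn(1024)
y = filter_via_transfer_function(u, b=[1.0, 0.5], a=[1.0, -0.9])
print(y.shape)  # (1024,)
```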
arXiv Detail & Related papers (2024-05-10T00:06:02Z) - Enabling Uncertainty Estimation in Iterative Neural Networks [49.56171792062104]
We develop an approach to uncertainty estimation that provides state-of-the-art estimates at a much lower computational cost than techniques like Ensembles.
We demonstrate its practical value by embedding it in two application domains: road detection in aerial images and the estimation of aerodynamic properties of 2D and 3D shapes.
arXiv Detail & Related papers (2024-03-25T13:06:31Z) - Quaternion-valued Correlation Learning for Few-Shot Semantic
Segmentation [33.88445464404075]
Few-shot segmentation (FSS) aims to segment unseen classes given only a few samples.
We introduce a quaternion perspective on correlation learning and propose a novel Quaternion-valued Correlation Learning Network (QCLNet)
Our QCLNet is formulated as a hyper-complex valued network and represents correlation tensors in the quaternion domain, which uses quaternion-valued convolution to explore the external relations of query subspace.
arXiv Detail & Related papers (2023-05-12T06:56:22Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Quaternion Backpropagation [0.0]
We show that the product and chain rules do not hold for quaternion backpropagation.
We experimentally validate the derived quaternion backpropagation.
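A one-line reminder of why the classical rules need restating (illustration only, not taken from the paper): quaternion multiplication is non-commutative, so factors in derivative expressions cannot be reordered freely.

```latex
% Minimal illustration (not from the paper): quaternion products do not commute,
% so derivative rules that silently reorder factors break.
\[
  ij = k, \qquad ji = -k, \qquad\text{hence for } f(q) = i\,q:\quad
  f(q+h) - f(q) = i\,h, \ \text{which in general differs from } h\,i .
\]
% Any product- or chain-rule identity that moves the increment h across a
% factor therefore has to be restated for quaternion backpropagation.
```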
arXiv Detail & Related papers (2022-12-26T10:56:19Z) - Instrumental Variable-Driven Domain Generalization with Unobserved
Confounders [53.735614014067394]
Domain generalization (DG) aims to learn from multiple source domains a model that can generalize well on unseen target domains.
We propose an instrumental variable-driven DG method (IV-DG) by removing the bias of the unobserved confounders with two-stage learning.
In the first stage, it learns the conditional distribution of the input features of one domain given input features of another domain.
In the second stage, it estimates the relationship by predicting labels with the learned conditional distribution.
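As a toy caricature of the two stages described above (synthetic data, generic linear models, not the authors' IV-DG), one can fit the conditional mean of one domain's features given another's and then train the label predictor on its output:

```python
# Toy caricature of the two-stage procedure sketched in the summary above:
# stage 1 fits the conditional mean of domain-A features given domain-B
# features; stage 2 predicts labels from that fitted conditional model.
# Synthetic data and generic sklearn models -- not the authors' IV-DG.
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)
n, d = 500, 8
x_b = rng.normal(size=(n, d))                                          # domain B features
x_a = x_b @ rng.normal(size=(d, d)) + 0.1 * rng.normal(size=(n, d))    # domain A features
y = (x_a[:, 0] + x_a[:, 1] > 0).astype(int)                            # shared labels

stage1 = LinearRegression().fit(x_b, x_a)            # conditional mean of x_A given x_B
x_a_hat = stage1.predict(x_b)                        # stage-1 predicted features
stage2 = LogisticRegression(max_iter=1000).fit(x_a_hat, y)  # stage-2 label predictor
print(stage2.score(x_a_hat, y))
```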
arXiv Detail & Related papers (2021-10-04T13:32:57Z) - InteL-VAEs: Adding Inductive Biases to Variational Auto-Encoders via
Intermediary Latents [60.785317191131284]
We introduce a simple and effective method for learning VAEs with controllable biases by using an intermediary set of latent variables.
In particular, it allows us to impose desired properties like sparsity or clustering on learned representations.
We show that this, in turn, allows InteL-VAEs to learn both better generative models and representations.
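A minimal sketch of the intermediary-latent idea as summarized above: a standard Gaussian latent is passed through a deterministic mapping before decoding, and that mapping is chosen to impose a desired property. The soft-threshold used below (to encourage sparsity) is an illustrative choice, not necessarily the paper's.

```python
# Minimal sketch of the intermediary-latent idea: a Gaussian latent z is passed
# through a deterministic mapping g before decoding, and g is chosen to impose a
# desired property -- here a soft-threshold that encourages sparsity. The
# specific mapping is illustrative, not necessarily the paper's.
import torch
import torch.nn as nn
import torch.nn.functional as F

class IntermediaryLatentVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=16):
        super().__init__()
        self.enc = nn.Linear(x_dim, 2 * z_dim)      # outputs (mu, log_var)
        self.dec = nn.Linear(z_dim, x_dim)

    def intermediary(self, z):
        # g(z): soft-threshold pushes small coordinates to exactly zero,
        # giving sparse intermediary latents w = g(z).
        return torch.sign(z) * F.relu(z.abs() - 0.5)

    def forward(self, x):
        mu, log_var = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()   # reparameterize
        w = self.intermediary(z)
        return torch.sigmoid(self.dec(w)), mu, log_var

x = torch.rand(4, 784)
recon, mu, log_var = IntermediaryLatentVAE()(x)
print(recon.shape)   # torch.Size([4, 784])
```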
arXiv Detail & Related papers (2021-06-25T16:34:05Z) - Quaternion Generative Adversarial Networks [5.156484100374058]
We propose a family of quaternion-valued adversarial networks (QGANs)
QGANs exploit the properties of quaternion algebra, e.g., the Hamilton product for convolutions.
Results show that QGANs are able to generate visually pleasing images and to obtain better FID scores than their real-valued counterparts.
arXiv Detail & Related papers (2021-04-19T20:46:18Z) - Quaternion Factorization Machines: A Lightweight Solution to Intricate
Feature Interaction Modelling [76.89779231460193]
The factorization machine (FM) is capable of automatically learning high-order interactions among features to make predictions without the need for manual feature engineering.
We propose the quaternion factorization machine (QFM) and quaternion neural factorization machine (QNFM) for sparse predictive analytics.
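For reference, the classical degree-2 FM predictor that QFM lifts into the quaternion domain is shown below; how the quaternion-valued factors and the Hamilton product enter the interaction term is the paper's contribution and is only hinted at in the comments.

```latex
% Classical degree-2 factorization machine (Rendle, 2010): each feature i gets a
% latent factor vector v_i, and pairwise interactions are scored by inner products.
\[
  \hat{y}(\mathbf{x}) \;=\; w_0 \;+\; \sum_{i=1}^{n} w_i x_i
  \;+\; \sum_{i=1}^{n}\sum_{j=i+1}^{n} \langle \mathbf{v}_i, \mathbf{v}_j \rangle\, x_i x_j .
\]
% QFM/QNFM (as summarized above) replace the real-valued factors v_i with
% quaternion-valued ones; the interaction is then assumed to involve the
% Hamilton product, which is where the lightweight parameterization comes from.
```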
arXiv Detail & Related papers (2021-04-05T00:02:36Z)
This list is automatically generated from the titles and abstracts of the papers on this site.