ManifoldNorm: Extending normalizations on Riemannian Manifolds
- URL: http://arxiv.org/abs/2003.13869v2
- Date: Sat, 4 Apr 2020 22:44:32 GMT
- Title: ManifoldNorm: Extending normalizations on Riemannian Manifolds
- Authors: Rudrasis Chakraborty
- Abstract summary: We propose a general normalization technique for manifold-valued data.
We show that our proposed manifold normalization technique has the popular batch norm and group norm techniques as special cases.
- Score: 18.073864874996534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Many measurements in computer vision and machine learning manifest as
non-Euclidean data samples. Several researchers have recently extended a number
of deep neural network architectures to manifold-valued data. Models have been
proposed for manifold-valued spatial data, which are common in medical image
processing, including diffusion tensor imaging (DTI), where images are fields
of $3\times 3$ symmetric positive definite matrices, and orientation
distribution function (ODF) imaging, where images are identified with fields on
the hypersphere. There are also sequential models for manifold-valued data that
researchers have recently shown to be effective for group difference analysis
in studies of neuro-degenerative diseases. Although several of these methods
deal effectively with manifold-valued data, a key bottleneck is the instability
of optimization for deeper networks. To address these instabilities,
researchers have proposed residual connections for manifold-valued data.
Another remedy for instabilities such as gradient explosion is to use
normalization techniques such as {\it batch norm} and {\it group norm}. So far,
however, no normalization technique has been applicable to manifold-valued
data. In this work, we propose a general normalization technique for
manifold-valued data and show that it has the popular batch norm and group norm
techniques as special cases. On the experimental side, we focus on two types of
manifold-valued data: the manifold of symmetric positive definite matrices and
the hypersphere. We show performance gains in one synthetic experiment on the
Moving MNIST dataset and on one real brain image dataset where the
representation is a field of orientation distribution functions (ODF).
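As a rough illustration of the centering step behind a manifold batch norm (map samples to a tangent space, subtract the batch Frechet mean, map back), the sketch below centers a batch of SPD matrices at the identity under the Log-Euclidean metric. This is a minimal simplification, not the authors' algorithm, and the function names are hypothetical.

```python
import numpy as np
from scipy.linalg import expm, logm


def spd_batch_center(batch):
    """Center a batch of SPD matrices at the identity under the
    Log-Euclidean metric: map each matrix to the tangent space with
    the matrix logarithm, subtract the batch (Frechet) mean there,
    and map back with the matrix exponential."""
    logs = np.array([logm(X).real for X in batch])
    mean_log = logs.mean(axis=0)  # Frechet mean in log-space
    return np.array([expm(L - mean_log) for L in logs])


# toy batch of four 3x3 SPD matrices
rng = np.random.default_rng(0)
A = rng.normal(size=(4, 3, 3))
batch = np.array([a @ a.T + 3.0 * np.eye(3) for a in A])
centered = spd_batch_center(batch)
```

After centering, the Log-Euclidean Frechet mean of the batch is the identity matrix, mirroring how batch norm shifts Euclidean activations to zero mean.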
Related papers
- Geodesic Optimization for Predictive Shift Adaptation on EEG data [53.58711912565724]
Domain adaptation methods struggle when distribution shifts occur simultaneously in $X$ and $y$.
This paper proposes a novel method termed Geodesic Optimization for Predictive Shift Adaptation (GOPSA) to address test-time multi-source DA.
GOPSA has the potential to combine the advantages of mixed-effects modeling with machine learning for biomedical applications of EEG.
arXiv Detail & Related papers (2024-07-04T12:15:42Z)
- GLAD: Towards Better Reconstruction with Global and Local Adaptive Diffusion Models for Unsupervised Anomaly Detection [60.78684630040313]
Diffusion models tend to reconstruct normal counterparts of test images with certain noises added.
From the global perspective, the difficulty of reconstructing images with different anomalies is uneven.
We propose a global and local adaptive diffusion model (abbreviated to GLAD) for unsupervised anomaly detection.
arXiv Detail & Related papers (2024-06-11T17:27:23Z)
- Implicit Manifold Gaussian Process Regression [49.0787777751317]
Gaussian process regression is widely used to provide well-calibrated uncertainty estimates.
It struggles with high-dimensional data, which often actually lie on an implicit low-dimensional manifold.
In this paper we propose a technique capable of inferring implicit structure directly from data (labeled and unlabeled) in a fully differentiable way.
arXiv Detail & Related papers (2023-10-30T09:52:48Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space shows an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Bayesian Hyperbolic Multidimensional Scaling [2.5944208050492183]
We propose a Bayesian approach to multidimensional scaling when the low-dimensional manifold is hyperbolic.
A case-control likelihood approximation allows for efficient sampling from the posterior distribution in larger data settings.
We evaluate the proposed method against state-of-the-art alternatives using simulations, canonical reference datasets, Indian village network data, and human gene expression data.
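For context on the hyperbolic setting this summary assumes, geodesic distances on a hyperbolic manifold have a simple closed form in the hyperboloid (Lorentz) model. The sketch below is a generic illustration of that formula, not the paper's implementation; the helper names are hypothetical.

```python
import numpy as np


def lorentz_inner(x, y):
    """Minkowski inner product <x, y> = -x0*y0 + sum_i xi*yi."""
    return -x[0] * y[0] + np.dot(x[1:], y[1:])


def hyperbolic_distance(x, y):
    """Geodesic distance on the hyperboloid model: arccosh(-<x, y>).
    The clip guards against -<x, y> dipping below 1 numerically."""
    return np.arccosh(np.clip(-lorentz_inner(x, y), 1.0, None))


def lift(u):
    """Embed a Euclidean point u on the hyperboloid x0 = sqrt(1 + |u|^2),
    so the point satisfies <x, x> = -1."""
    return np.concatenate(([np.sqrt(1.0 + np.dot(u, u))], u))


x = lift(np.array([0.0, 0.0]))
y = lift(np.array([1.0, 0.0]))
d = hyperbolic_distance(x, y)  # arccosh(sqrt(2)) ~ 0.8814
```

Embedding into such a space, rather than a Euclidean one, is what lets hyperbolic MDS represent tree-like dissimilarity structure in few dimensions.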
arXiv Detail & Related papers (2022-10-26T23:34:30Z)
- The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for implementing the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z)
- RENs: Relevance Encoding Networks [0.0]
This paper proposes relevance encoding networks (RENs): a novel probabilistic VAE-based framework that uses the automatic relevance determination (ARD) prior in the latent space to learn the data-specific bottleneck dimensionality.
We show that the proposed model learns the relevant latent bottleneck dimensionality without compromising the representation and generation quality of the samples.
arXiv Detail & Related papers (2022-05-25T21:53:48Z)
- Manifold Topology Divergence: a Framework for Comparing Data Manifolds [109.0784952256104]
We develop a framework for comparing data manifolds, aimed at the evaluation of deep generative models.
Based on the Cross-Barcode, we introduce the Manifold Topology Divergence score (MTop-Divergence).
We demonstrate that the MTop-Divergence accurately detects various degrees of mode-dropping, intra-mode collapse, mode invention, and image disturbance.
arXiv Detail & Related papers (2021-06-08T00:30:43Z)
- Diffusion Earth Mover's Distance and Distribution Embeddings [61.49248071384122]
Diffusion EMD can be computed in $\tilde{O}(n)$ time and is more accurate than similarly fast algorithms such as tree-based methods.
We show that Diffusion EMD is fully differentiable, making it amenable to future uses in gradient-descent frameworks such as deep neural networks.
arXiv Detail & Related papers (2021-02-25T13:18:32Z)
- Flow-based Generative Models for Learning Manifold to Manifold Mappings [39.60406116984869]
We introduce three kinds of invertible layers for manifold-valued data, analogous in functionality to those in flow-based generative models.
We show promising results where we can reliably and accurately reconstruct brain images of a field of orientation distribution functions.
arXiv Detail & Related papers (2020-12-18T02:19:18Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.