On Wasserstein distances for affine transformations of random vectors
- URL: http://arxiv.org/abs/2310.03945v2
- Date: Wed, 7 Feb 2024 20:06:32 GMT
- Title: On Wasserstein distances for affine transformations of random vectors
- Authors: Keaton Hamm, Andrzej Korzeniowski
- Abstract summary: We give concrete lower bounds for rotated copies of random vectors in $\mathbb{R}^2$.
We derive upper bounds for compositions of affine maps which yield a fruitful variety of diffeomorphisms applied to an initial data measure.
We give a framework for mimicking handwritten digit or alphabet datasets that can be applied in a manifold learning framework.
- Score: 1.2836088204932843
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We expound on some known lower bounds of the quadratic Wasserstein distance
between random vectors in $\mathbb{R}^n$ with an emphasis on affine
transformations that have been used in manifold learning of data in Wasserstein
space. In particular, we give concrete lower bounds for rotated copies of
random vectors in $\mathbb{R}^2$ by computing the Bures metric between the
covariance matrices. We also derive upper bounds for compositions of affine
maps which yield a fruitful variety of diffeomorphisms applied to an initial
data measure. We apply these bounds to various distributions including those
lying on a 1-dimensional manifold in $\mathbb{R}^2$ and illustrate the quality
of the bounds. Finally, we give a framework for mimicking handwritten digit or
alphabet datasets that can be applied in a manifold learning framework.
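The lower bound highlighted in the abstract is the classical Gelbrich bound: for measures on $\mathbb{R}^n$ with means $m_1, m_2$ and covariance matrices $\Sigma_1, \Sigma_2$, $W_2^2(\mu,\nu) \ge \|m_1-m_2\|^2 + B^2(\Sigma_1,\Sigma_2)$, where $B^2(\Sigma_1,\Sigma_2) = \mathrm{tr}\,\Sigma_1 + \mathrm{tr}\,\Sigma_2 - 2\,\mathrm{tr}\,(\Sigma_1^{1/2}\Sigma_2\Sigma_1^{1/2})^{1/2}$ is the squared Bures metric, with equality for Gaussian measures. Below is a minimal NumPy/SciPy sketch of this bound for a rotated copy $R_\theta X$ in $\mathbb{R}^2$, together with the generic coupling upper bound $W_2^2(\mu, T_\#\mu) \le \mathbb{E}\|X - T(X)\|^2$ for an affine map $T(x) = Ax + b$; the function names, rotation angle, and example covariance are illustrative assumptions, not the paper's code.

```python
import numpy as np
from scipy.linalg import sqrtm


def bures_sq(S1, S2):
    """Squared Bures metric between PSD matrices:
    B^2(S1, S2) = tr(S1) + tr(S2) - 2 tr((S1^{1/2} S2 S1^{1/2})^{1/2})."""
    root = sqrtm(S1)
    cross = sqrtm(root @ S2 @ root)
    # sqrtm may return a complex array with a negligible imaginary part
    return float(np.trace(S1) + np.trace(S2) - 2.0 * np.real(np.trace(cross)))


def w2_sq_lower_bound(m1, S1, m2, S2):
    """Gelbrich bound: W_2^2(mu, nu) >= ||m1 - m2||^2 + B^2(S1, S2)."""
    return float(np.sum((m1 - m2) ** 2)) + bures_sq(S1, S2)


def w2_sq_upper_bound_affine(m, S, A, b):
    """Coupling upper bound for the affine pushforward T(x) = Ax + b:
    W_2^2(mu, T#mu) <= E||X - T(X)||^2
                     = tr((I - A) S (I - A)^T) + ||(I - A)m - b||^2."""
    D = np.eye(A.shape[0]) - A
    return float(np.trace(D @ S @ D.T) + np.sum((D @ m - b) ** 2))


# Rotated copy in R^2: Y = R_theta X, so Sigma_Y = R_theta Sigma_X R_theta^T.
theta = np.pi / 4                        # hypothetical rotation angle
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
m = np.zeros(2)
Sigma = np.diag([4.0, 1.0])              # hypothetical covariance of X

lo = np.sqrt(w2_sq_lower_bound(m, Sigma, R @ m, R @ Sigma @ R.T))
hi = np.sqrt(w2_sq_upper_bound_affine(m, Sigma, R, np.zeros(2)))
print(f"W_2 lower bound: {lo:.4f}, upper bound: {hi:.4f}")
```

Since the Gelbrich bound is attained for Gaussian data, the gap between the two printed numbers indicates how loose the generic coupling upper bound is in this rotated-copy case.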
Related papers
- An Ad-hoc graph node vector embedding algorithm for general knowledge graphs using Kinetica-Graph [0.0]
This paper discusses how to generate general graph node embeddings from knowledge graph representations.
The embedded space is composed of a number of sub-features to mimic both local affinity and remote structural relevance.
arXiv Detail & Related papers (2024-07-22T14:43:10Z)
- Reconstructing the Geometry of Random Geometric Graphs [9.004991291124096]
Random geometric graphs are random graph models defined on metric spaces.
We show how to efficiently reconstruct the geometry of the underlying space from the sampled graph.
arXiv Detail & Related papers (2024-02-14T21:34:44Z)
- Intrinsic Bayesian Cramér-Rao Bound with an Application to Covariance Matrix Estimation [49.67011673289242]
This paper presents a new performance bound for estimation problems where the parameter to estimate lies in a smooth manifold.
It induces a geometry for the parameter manifold, as well as an intrinsic notion of the estimation error measure.
arXiv Detail & Related papers (2023-11-08T15:17:13Z)
- Principal subbundles for dimension reduction [0.07515511160657122]
We show how sub-Riemannian geometry can be used for manifold learning and surface reconstruction.
We show that the framework is robust when applied to noisy data.
arXiv Detail & Related papers (2023-07-06T16:55:21Z)
- Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z)
- Equivalence Between SE(3) Equivariant Networks via Steerable Kernels and Group Convolution [90.67482899242093]
A wide range of techniques have been proposed in recent years for designing neural networks for 3D data that are equivariant under rotation and translation of the input.
We provide an in-depth analysis of both methods and their equivalence and relate the two constructions to multiview convolutional networks.
We also derive new TFN non-linearities from our equivalence principle and test them on practical benchmark datasets.
arXiv Detail & Related papers (2022-11-29T03:42:11Z)
- When Random Tensors meet Random Matrices [50.568841545067144]
This paper studies asymmetric order-$d$ spiked tensor models with Gaussian noise.
We show that the analysis of the considered model boils down to the analysis of an equivalent spiked symmetric block-wise random matrix.
arXiv Detail & Related papers (2021-12-23T04:05:01Z)
- A Differential Geometry Perspective on Orthogonal Recurrent Models [56.09491978954866]
We employ tools and insights from differential geometry to offer a novel perspective on orthogonal RNNs.
We show that orthogonal RNNs may be viewed as optimizing in the space of divergence-free vector fields.
Motivated by this observation, we study a new recurrent model, which spans the entire space of vector fields.
arXiv Detail & Related papers (2021-02-18T19:39:22Z)
- Manifold learning with arbitrary norms [8.433233101044197]
We show in a numerical simulation that manifold learning based on Earthmover's distances outperforms the standard Euclidean variant for learning molecular shape spaces.
arXiv Detail & Related papers (2020-12-28T10:24:30Z)
- A diffusion approach to Stein's method on Riemannian manifolds [65.36007959755302]
We exploit the relationship between the generator of a diffusion on $\mathbf{M}$ with target invariant measure and its characterising Stein operator.
We derive Stein factors, which bound the solution to the Stein equation and its derivatives.
These results imply that the bounds for $\mathbb{R}^m$ remain valid when $\mathbf{M}$ is a flat manifold.
arXiv Detail & Related papers (2020-03-25T17:03:58Z)