A prior-based approximate latent Riemannian metric
- URL: http://arxiv.org/abs/2103.05290v1
- Date: Tue, 9 Mar 2021 08:31:52 GMT
- Title: A prior-based approximate latent Riemannian metric
- Authors: Georgios Arvanitidis, Bogdan Georgiev, Bernhard Schölkopf
- Abstract summary: We propose a surrogate conformal generative metric in the latent space of a generative model that is simple, efficient and robust.
We theoretically analyze the behavior of the proposed metric and show that it is sensible to use in practice.
Also, we show the applicability of the proposed methodology for data analysis in the life sciences.
- Score: 3.716965622352967
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Stochastic generative models enable us to capture the geometric structure of
a data manifold lying in a high dimensional space through a Riemannian metric
in the latent space. However, its practical use is rather limited mainly due to
inevitable complexity. In this work we propose a surrogate conformal Riemannian
metric in the latent space of a generative model that is simple, efficient and
robust. This metric is based on a learnable prior that we propose to learn
using a basic energy-based model. We theoretically analyze the behavior of the
proposed metric and show that it is sensible to use in practice. We demonstrate
experimentally the efficiency and robustness, as well as the behavior of the
new approximate metric. Also, we show the applicability of the proposed
methodology for data analysis in the life sciences.
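The abstract describes a conformal Riemannian metric in latent space built from a learnable prior. The paper's exact construction is not reproduced in this listing, so the following is only a minimal sketch: it assumes a hypothetical `energy` function standing in for the learned energy-based model, and a conformal metric of the generic form G(z) = λ(z)·I with λ(z) = exp(E(z)), under which curves through high-energy (low prior density) regions become longer.

```python
import numpy as np

def energy(z):
    # Hypothetical stand-in for a learned energy-based model E_theta(z);
    # low energy near the origin mimics high prior density there.
    return np.sum(z ** 2)

def conformal_factor(z):
    # Conformal metric G(z) = lam(z) * I with lam(z) = exp(E(z)) > 0.
    return np.exp(energy(z))

def curve_length(curve):
    # Discrete Riemannian length under a conformal metric:
    # sum_i sqrt(lam(midpoint_i)) * ||z_{i+1} - z_i||.
    length = 0.0
    for z0, z1 in zip(curve[:-1], curve[1:]):
        mid = 0.5 * (z0 + z1)
        length += np.sqrt(conformal_factor(mid)) * np.linalg.norm(z1 - z0)
    return length

# A straight line from (-1, 0) to (1, 0), discretized into 100 segments.
t = np.linspace(0.0, 1.0, 101)[:, None]
line = (1 - t) * np.array([-1.0, 0.0]) + t * np.array([1.0, 0.0])
riem_len = curve_length(line)
# Since lam(z) >= 1 everywhere here, the Riemannian length exceeds the
# Euclidean length of 2.
```

The point of the conformal form is efficiency: the metric is a scalar field times the identity, so evaluating lengths requires only the prior density, not a decoder Jacobian.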
Related papers
- Score-based pullback Riemannian geometry [10.649159213723106]
We propose a framework for data-driven Riemannian geometry that is scalable in both geometry and learning.
We produce high-quality geodesics through the data support and reliably estimate the intrinsic dimension of the data manifold.
Our framework can naturally be used with anisotropic normalizing flows by adopting isometry regularization during training.
arXiv Detail & Related papers (2024-10-02T18:52:12Z) - Isometric Immersion Learning with Riemannian Geometry [4.987314374901577]
There is still no manifold learning method that provides a theoretical guarantee of isometry.
Inspired by Nash's isometric theorem, we introduce a new concept called isometric immersion learning.
An unsupervised neural network-based model that simultaneously achieves metric and manifold learning is proposed.
arXiv Detail & Related papers (2024-09-23T07:17:06Z) - (Deep) Generative Geodesics [57.635187092922976]
We introduce a new Riemannian metric to assess the similarity between any two data points.
Our metric leads to the conceptual definition of generative distances and generative geodesics.
Their approximations are proven to converge to their true values under mild conditions.
arXiv Detail & Related papers (2024-07-15T21:14:02Z) - Discovering Interpretable Physical Models using Symbolic Regression and
Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z) - An evaluation framework for dimensionality reduction through sectional
curvature [59.40521061783166]
In this work, we aim to introduce the first fully unsupervised dimensionality reduction performance metric.
To test its feasibility, this metric has been used to evaluate the performance of the most commonly used dimension reduction algorithms.
A new parameterized problem instance generator has been constructed in the form of a function generator.
arXiv Detail & Related papers (2023-03-17T11:59:33Z) - Identifying latent distances with Finslerian geometry [6.0188611984807245]
Generative models make the data space, and geodesics within it, at best impractical and at worst impossible to manipulate.
In this work, we propose another metric whose geodesics explicitly minimise the expected length of the pullback metric.
In high dimensions, we prove that both metrics converge to each other at a rate of $O\left(\frac{1}{D}\right)$.
arXiv Detail & Related papers (2022-12-20T05:57:27Z) - Riemannian Metric Learning via Optimal Transport [34.557360177483595]
We introduce an optimal transport-based model for learning a metric from cross-sectional samples of evolving probability measures.
We show that metrics learned using our method improve the quality of trajectory inference on scRNA and bird migration data.
arXiv Detail & Related papers (2022-05-18T23:32:20Z) - Nonparametric Functional Analysis of Generalized Linear Models Under
Nonlinear Constraints [0.0]
This article introduces a novel nonparametric methodology for Generalized Linear Models.
It combines the strengths of the binary regression and latent variable formulations for categorical data.
It extends and generalizes recently published parametric versions of the methodology.
arXiv Detail & Related papers (2021-10-11T04:49:59Z) - GELATO: Geometrically Enriched Latent Model for Offline Reinforcement
Learning [54.291331971813364]
Offline reinforcement learning approaches can be divided into proximal and uncertainty-aware methods.
In this work, we demonstrate the benefit of combining the two in a latent variational model.
Our proposed metrics measure both the quality of out of distribution samples as well as the discrepancy of examples in the data.
arXiv Detail & Related papers (2021-02-22T19:42:40Z) - Bayesian Quadrature on Riemannian Data Manifolds [79.71142807798284]
A principled way to model nonlinear geometric structure inherent in data is provided.
However, these operations are typically computationally demanding.
In particular, we focus on Bayesian quadrature (BQ) to numerically compute integrals over normal laws.
We show that by leveraging both prior knowledge and an active exploration scheme, BQ significantly reduces the number of required evaluations.
arXiv Detail & Related papers (2021-02-12T17:38:04Z) - On the minmax regret for statistical manifolds: the role of curvature [68.8204255655161]
Two-part codes and the minimum description length have been successful in delivering procedures to single out the best models.
We derive a sharper expression than the standard one given by the complexity, where the scalar curvature of the Fisher information metric plays a dominant role.
arXiv Detail & Related papers (2020-07-06T17:28:19Z)
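Several of the entries above (the score-based pullback framework, the Finslerian comparison, GELATO) revolve around the pullback metric of a generator: the latent metric G(z) = J(z)ᵀJ(z), where J is the Jacobian of the decoder. As a minimal illustration with an assumed toy decoder (not any listed paper's actual model):

```python
import numpy as np

def decoder(z):
    # Toy smooth immersion of a 1-D latent into 2-D data space:
    # maps z onto the unit circle (hypothetical, for illustration only).
    return np.array([np.cos(z), np.sin(z)])

def jacobian(z, eps=1e-6):
    # Central finite-difference Jacobian of the decoder at z (2x1 here).
    return ((decoder(z + eps) - decoder(z - eps)) / (2 * eps))[:, None]

def pullback_metric(z):
    # Pullback Riemannian metric G(z) = J(z)^T J(z): measures a latent
    # displacement by the data-space length of its image under the decoder.
    J = jacobian(z)
    return J.T @ J

G = pullback_metric(0.3)
# For the unit-circle decoder the Jacobian column has norm 1 everywhere,
# so G(z) is the 1x1 identity and latent distance equals arc length.
```

This exact Jacobian-based construction is what makes the pullback metric expensive for deep stochastic generators, which is the limitation the surrogate conformal metric above is designed to sidestep.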
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.