A Unifying and Canonical Description of Measure-Preserving Diffusions
- URL: http://arxiv.org/abs/2105.02845v1
- Date: Thu, 6 May 2021 17:36:55 GMT
- Title: A Unifying and Canonical Description of Measure-Preserving Diffusions
- Authors: Alessandro Barp, So Takao, Michael Betancourt, Alexis Arnaudon, Mark
Girolami
- Abstract summary: A complete recipe of measure-preserving diffusions in Euclidean space was recently derived, unifying several MCMC algorithms into a single framework.
We develop a geometric theory that improves and generalises this construction to any manifold.
- Score: 60.59592461429012
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A complete recipe of measure-preserving diffusions in Euclidean space was
recently derived, unifying several MCMC algorithms into a single framework. In
this paper, we develop a geometric theory that improves and generalises this
construction to any manifold. We thereby demonstrate that the completeness
result is a direct consequence of the topology of the underlying manifold and
the geometry induced by the target measure $P$; there is no need to introduce
other structures such as a Riemannian metric, local coordinates, or a reference
measure. Instead, our framework relies on the intrinsic geometry of $P$ and in
particular its canonical derivative, the de Rham rotationnel, which allows us to
parametrise the Fokker--Planck currents of measure-preserving diffusions using
potentials. The geometric formalism can easily incorporate constraints and
symmetries, and deliver important new insights, for example, a new complete
recipe of Langevin-like diffusions that are suited to the construction of
samplers. We also analyse the reversibility and dissipative properties of the
diffusions, the associated deterministic flow on the space of measures, and the
geometry of Langevin processes. Our article connects ideas from several strands
of the literature and frames the theory of measure-preserving diffusions in its
appropriate mathematical context.
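To make the Euclidean starting point concrete, here is a minimal sketch of the "complete recipe" of Ma et al. (2015) that the abstract refers to: any diffusion $dz = [-(D+Q)\nabla H + \Gamma]\,dt + \sqrt{2D}\,dW$ with $D$ positive semi-definite, $Q$ antisymmetric, and $\Gamma_i = \sum_j \partial_j (D_{ij} + Q_{ij})$ preserves the target $P \propto e^{-H}$. The function names and the toy Gaussian target below are our own illustration, not code from the paper.

```python
import numpy as np

def complete_recipe_step(z, grad_H, D, Q, div_DQ, dt, rng):
    """One Euler--Maruyama step of the complete-recipe diffusion
        dz = [-(D(z) + Q(z)) grad_H(z) + Gamma(z)] dt + sqrt(2 D(z)) dW,
    which leaves P(z) ~ exp(-H(z)) invariant for positive semi-definite D
    and antisymmetric Q. `div_DQ(z)` must return the correction term
    Gamma_i(z) = sum_j d(D_ij + Q_ij)/dz_j."""
    drift = -(D(z) + Q(z)) @ grad_H(z) + div_DQ(z)
    # A Cholesky factor keeps this valid for state-dependent D as well.
    L = np.linalg.cholesky(2.0 * dt * D(z))
    return z + dt * drift + L @ rng.standard_normal(z.shape)

# Toy target (our illustration): standard Gaussian, H(z) = |z|^2 / 2,
# with constant D and a constant antisymmetric Q, so Gamma = 0.
rng = np.random.default_rng(0)
D_mat = np.eye(2)
Q_mat = np.array([[0.0, 1.0], [-1.0, 0.0]])  # irreversible circulation
z, samples = np.zeros(2), []
for _ in range(20_000):
    z = complete_recipe_step(z, grad_H=lambda x: x, D=lambda _: D_mat,
                             Q=lambda _: Q_mat, div_DQ=lambda _: np.zeros(2),
                             dt=0.05, rng=rng)
    samples.append(z)
print(np.mean(samples, axis=0), np.cov(np.array(samples).T))  # ~0 and ~I
```

Choosing $Q \neq 0$ adds an irreversible circulation that still leaves $P$ invariant but can accelerate mixing, which is one reason the recipe covers non-reversible samplers as well.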
Related papers
- Bridging Geometric States via Geometric Diffusion Bridge [79.60212414973002]
We introduce the Geometric Diffusion Bridge (GDB), a novel generative modeling framework that accurately bridges initial and target geometric states.
GDB employs an equivariant diffusion bridge, derived via a modified version of Doob's $h$-transform, to connect geometric states.
We show that GDB surpasses existing state-of-the-art approaches, opening up a new pathway for accurately bridging geometric states.
arXiv Detail & Related papers (2024-10-31T17:59:53Z)
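The bridging idea above rests on Doob's $h$-transform. As a hedged illustration of the underlying mechanism (not GDB's equivariant construction), the sketch below conditions a Brownian motion to hit a prescribed endpoint: for Brownian motion the $h$-transform drift is $(x_T - x_t)/(T - t)$, recovering the classical Brownian bridge.

```python
import numpy as np

def doob_bridge_path(x0, xT, T, n_steps, rng):
    """Simulate a Brownian motion conditioned (via Doob's h-transform)
    to end at xT at time T. For Brownian motion the h-transform drift is
    grad log h(t, x) = (xT - x) / (T - t), i.e. the Brownian bridge."""
    dt = T / n_steps
    x = np.array(x0, dtype=float)
    path = [x.copy()]
    for k in range(n_steps):
        t = k * dt
        drift = (np.asarray(xT) - x) / (T - t)  # h-transform drift
        x = x + drift * dt + np.sqrt(dt) * rng.standard_normal(x.shape)
        path.append(x.copy())
    return np.array(path)

rng = np.random.default_rng(1)
path = doob_bridge_path(x0=[0.0, 0.0], xT=[1.0, -1.0], T=1.0,
                        n_steps=200, rng=rng)
print(path[-1])  # near (1, -1), up to one Euler step of noise
```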
- Sigma Flows for Image and Data Labeling and Learning Structured Prediction [2.4699742392289]
This paper introduces the sigma flow model for the prediction of structured labelings of data observed on a Riemannian manifold.
The approach combines the Laplace-Beltrami framework for image denoising and enhancement, introduced by Sochen, Kimmel and Malladi about 25 years ago, and the assignment flow approach introduced and studied by the authors.
arXiv Detail & Related papers (2024-08-28T17:04:56Z)
- Topological Obstructions and How to Avoid Them [22.45861345237023]
We show that local optima can arise due to singularities or an incorrect degree or winding number.
We propose a new flow-based model that maps data points to multimodal distributions over geometric spaces.
arXiv Detail & Related papers (2023-12-12T18:56:14Z)
- The Fisher-Rao geometry of CES distributions [50.50897590847961]
The Fisher-Rao information geometry allows tools from differential geometry to be leveraged for statistical problems.
We present some practical uses of these geometric tools in the framework of elliptical distributions.
arXiv Detail & Related papers (2023-10-02T09:23:32Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors into infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- A singular Riemannian geometry approach to Deep Neural Networks I. Theoretical foundations [77.86290991564829]
Deep Neural Networks are widely used for solving complex problems in several scientific areas, such as speech recognition, machine translation, image analysis.
We study a particular sequence of maps between manifolds, with the last manifold of the sequence equipped with a Riemannian metric.
We investigate the theoretical properties of the maps in such a sequence, eventually focusing on maps implementing neural networks of practical interest.
arXiv Detail & Related papers (2021-12-17T11:43:30Z)
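The entry above and the next one both concern geometries obtained by pulling a metric back through a smooth map, such as a neural network or a decoder. A minimal sketch of that mechanism, under our own toy assumptions (neither paper's implementation): for a map $f$ with Jacobian $J$ at $x$, the pullback of a metric $G'$ is $G(x) = J^\top G'(f(x))\, J$.

```python
import numpy as np

def pullback_metric(f, G_prime, x, eps=1e-6):
    """Pull back the Riemannian metric G_prime (defined at f(x)) through
    the smooth map f, using a finite-difference Jacobian:
        G(x) = J^T G'(f(x)) J.
    When f is a neural network, J is its input-output Jacobian at x."""
    y = f(x)
    n, m = x.size, y.size
    J = np.empty((m, n))
    for i in range(n):  # forward-difference Jacobian, column by column
        dx = np.zeros(n)
        dx[i] = eps
        J[:, i] = (f(x + dx) - y) / eps
    return J.T @ G_prime(y) @ J

# Toy example (illustrative names): pull the Euclidean metric on R^3
# back through a smooth embedding of R^2 into R^3.
f = lambda x: np.array([x[0], x[1], np.sin(x[0]) * np.cos(x[1])])
G = pullback_metric(f, lambda y: np.eye(3), np.array([0.3, 0.7]))
print(G)  # 2x2 symmetric positive-definite matrix
```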
- Pulling back information geometry [3.0273878903284266]
We show that we can achieve meaningful latent geometries for a wide range of decoder distributions.
arXiv Detail & Related papers (2021-06-09T20:16:28Z)
- Geometric variational inference [0.0]
Variational Inference (VI) or Markov-Chain Monte-Carlo (MCMC) techniques are used to go beyond point estimates.
This work proposes geometric Variational Inference (geoVI), a method based on Riemannian geometry and the Fisher information metric.
The distribution, expressed in the coordinate system induced by the transformation, takes a particularly simple form that allows for an accurate variational approximation.
arXiv Detail & Related papers (2021-05-21T17:18:50Z)
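geoVI's central step is a coordinate transformation in which the Fisher metric becomes (approximately) Euclidean. A hedged one-dimensional sketch of the idea (our toy example, not the geoVI algorithm): if a parameter has Fisher information $m(\theta)$, the reparametrisation $y(\theta) = \int \sqrt{m(\theta)}\,d\theta$ has unit Fisher metric, so a Gaussian approximation in $y$-coordinates can remain accurate even when the original posterior is skewed.

```python
import numpy as np

# Toy 1D illustration (ours, not geoVI's actual algorithm): for a Gaussian
# likelihood with known mean and unknown scale sigma, the Fisher
# information is m(sigma) = 2 / sigma**2, so the flattening coordinate is
# y(sigma) = integral sqrt(m) dsigma = sqrt(2) * log(sigma).
def to_flat_coords(sigma):
    return np.sqrt(2.0) * np.log(sigma)

def from_flat_coords(y):
    return np.exp(y / np.sqrt(2.0))

# A Gaussian in the flat y-coordinate pulls back to a skewed (log-normal)
# distribution over sigma in the original coordinates:
rng = np.random.default_rng(2)
y_samples = 0.5 + 0.3 * rng.standard_normal(100_000)  # variational Gaussian in y
sigma_samples = from_flat_coords(y_samples)
print(sigma_samples.mean(), np.median(sigma_samples))  # mean > median: skew
```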
- Disentangling by Subspace Diffusion [72.1895236605335]
We show that fully unsupervised factorization of a data manifold is possible if the true metric of the manifold is known.
Our work reduces the question of whether unsupervised disentangling is possible to that of whether unsupervised metric learning is possible, providing a unifying insight into the geometric nature of representation learning.
arXiv Detail & Related papers (2020-06-23T13:33:19Z)
This list is automatically generated from the titles and abstracts of the papers on this site.