Transport f-divergences
- URL: http://arxiv.org/abs/2504.15515v2
- Date: Wed, 23 Apr 2025 01:18:37 GMT
- Title: Transport f-divergences
- Authors: Wuchen Li
- Abstract summary: We define a class of divergences to measure differences between probability density functions in one-dimensional sample space. The construction applies a convex function to the Jacobian of the mapping function that pushes one density forward to the other.
- Score: 2.817412580574242
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We define a class of divergences to measure differences between probability density functions in one-dimensional sample space. The construction applies a convex function to the Jacobian of the mapping function that pushes one density forward to the other. We call these information measures transport $f$-divergences. We present several properties of transport $f$-divergences, including invariances, convexities, variational formulations, and Taylor expansions in terms of the mapping functions. Examples of transport $f$-divergences in generative models are provided.
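In one dimension, the monotone map pushing a density $p$ forward to $q$ is $T = F_q^{-1} \circ F_p$, with Jacobian $T'(x) = p(x)/q(T(x))$. The sketch below assumes, for illustration only, that the divergence takes the form $D_f(p,q) = \int f(T'(x))\,p(x)\,dx$ for a convex $f$ with $f(1)=0$; see the paper for the precise definition, and note that all names here are illustrative.

```python
import numpy as np
from scipy.integrate import trapezoid
from scipy.stats import norm

def transport_f_divergence(p, q, f, xs):
    """Quadrature sketch of an assumed form D_f(p, q) = E_p[f(T'(x))],
    where T = F_q^{-1} o F_p is the monotone map pushing p forward to q."""
    Tx = q.ppf(p.cdf(xs))            # transport map T evaluated on the grid
    T_prime = p.pdf(xs) / q.pdf(Tx)  # 1D Jacobian of T
    return trapezoid(f(T_prime) * p.pdf(xs), xs)

# Grid kept inside +-6 so CDF values stay strictly within (0, 1) in float64.
xs = np.linspace(-6.0, 6.0, 2401)
f_kl = lambda u: u * np.log(u)       # a convex f with f(1) = 0
p, q = norm(0.0, 1.0), norm(1.0, 2.0)
print(transport_f_divergence(p, q, f_kl, xs))  # ~2*log(2): here T(x) = 1 + 2x, so T' = 2
print(transport_f_divergence(p, p, f_kl, xs))  # ~0: T is the identity when p = q
```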
Related papers
- Unbiased Estimating Equation on Inverse Divergence and Its Conditions [0.10742675209112622]
This paper focuses on the Bregman divergence generated by the reciprocal function, called the inverse divergence.
For the loss function defined by a monotonically increasing function $f$ and the inverse divergence, the paper clarifies the conditions on the statistical model and on $f$ under which the estimating equation is unbiased.
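For reference, the inverse divergence mentioned above is the Bregman divergence generated by $\phi(u) = 1/u$, which is convex for $u > 0$. A minimal sketch (the function name is mine):

```python
def inverse_divergence(x, y):
    """Bregman divergence generated by phi(u) = 1/u for x, y > 0:
    D(x, y) = phi(x) - phi(y) - phi'(y) * (x - y), with phi'(u) = -1/u**2."""
    return 1.0 / x - 1.0 / y + (x - y) / y**2

print(inverse_divergence(2.0, 2.0))  # 0.0: Bregman divergences vanish at x == y
print(inverse_divergence(1.0, 2.0))  # 0.25: nonnegative since phi is convex
```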
arXiv Detail & Related papers (2024-04-25T11:22:48Z)
- Theoretical Insights for Diffusion Guidance: A Case Study for Gaussian Mixture Models [59.331993845831946]
Diffusion models benefit from instilling task-specific information into the score function to steer sample generation toward desired properties.
This paper provides the first theoretical study towards understanding the influence of guidance on diffusion models in the context of Gaussian mixture models.
arXiv Detail & Related papers (2024-03-03T23:15:48Z)
- Functional Diffusion [55.251174506648454]
We propose a new class of generative diffusion models, called functional diffusion.
Functional diffusion can be seen as an extension of classical diffusion models to an infinite-dimensional domain.
We show generative results on complicated signed distance functions and deformation functions defined on 3D surfaces.
arXiv Detail & Related papers (2023-11-26T21:35:34Z)
- Conditional Optimal Transport on Function Spaces [53.9025059364831]
We develop a theory of constrained optimal transport problems that describe block-triangular Monge maps.
This generalizes the theory of optimal triangular transport to separable infinite-dimensional function spaces with general cost functions.
We present numerical experiments that demonstrate the computational applicability of our theoretical results for amortized and likelihood-free inference of functional parameters.
arXiv Detail & Related papers (2023-11-09T18:44:42Z)
- Manifold Diffusion Fields [11.4726574705951]
We present an approach that unlocks learning of diffusion models of data in non-Euclidean geometries.
We define an intrinsic coordinate system on the manifold via the eigenfunctions of the Laplace-Beltrami operator.
We show that MDF (Manifold Diffusion Fields) can capture distributions of such functions with better diversity and fidelity than previous approaches.
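On sampled data, Laplace-Beltrami eigenfunctions are typically approximated by eigenvectors of a graph Laplacian built on the point cloud. A minimal sketch of that standard approximation (the Gaussian kernel, bandwidth, and circle example are illustrative choices, not taken from the paper):

```python
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist

# Point cloud sampled from a circle embedded in R^2.
rng = np.random.default_rng(0)
theta = np.sort(rng.uniform(0.0, 2.0 * np.pi, 400))
pts = np.c_[np.cos(theta), np.sin(theta)]

W = np.exp(-cdist(pts, pts) ** 2 / 0.05)  # Gaussian affinities, bandwidth 0.05
L = np.diag(W.sum(axis=1)) - W            # unnormalized graph Laplacian
vals, vecs = eigh(L)                      # eigenpairs in ascending order

# The first nontrivial eigenvectors approximate cos/sin of the intrinsic
# angle, i.e. an intrinsic coordinate system for the circle.
coords = vecs[:, 1:3]
```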
arXiv Detail & Related papers (2023-05-24T21:42:45Z)
- The Inverse of Exact Renormalization Group Flows as Statistical Inference [0.0]
We build on the view of the Exact Renormalization Group (ERG) as an instantiation of Optimal Transport.
We provide a new information theoretic perspective for understanding the ERG through the intermediary of Bayesian Statistical Inference.
arXiv Detail & Related papers (2022-12-21T21:38:34Z)
- The Manifold Scattering Transform for High-Dimensional Point Cloud Data [16.500568323161563]
We present practical schemes for implementing the manifold scattering transform to datasets arising in naturalistic systems.
We show that our methods are effective for signal classification and manifold classification tasks.
arXiv Detail & Related papers (2022-06-21T02:15:00Z)
- Wrapped Distributions on homogeneous Riemannian manifolds [58.720142291102135]
Control over the distributions' properties, such as parameters, symmetry, and modality, yields a family of flexible distributions.
We empirically validate our approach by utilizing our proposed distributions within a variational autoencoder and a latent space network model.
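The simplest instance is the wrapped normal on the circle $S^1$: sample in the tangent space and push through the periodic exponential map. A minimal sketch (parameters illustrative; the paper treats general homogeneous Riemannian manifolds):

```python
import numpy as np

# Wrapped normal on S^1: draw from N(mu, sigma^2) in the tangent space,
# then wrap through the exponential map theta -> theta mod 2*pi.
rng = np.random.default_rng(0)
mu, sigma = np.pi / 4.0, 0.5
theta = (mu + sigma * rng.standard_normal(10_000)) % (2.0 * np.pi)

def wrapped_normal_pdf(t, mu, sigma, n_wrap=10):
    """Density of the wrapped normal: a sum of Gaussian branches over windings."""
    k = np.arange(-n_wrap, n_wrap + 1)
    branches = np.exp(-(t[:, None] - mu + 2.0 * np.pi * k) ** 2 / (2.0 * sigma**2))
    return branches.sum(axis=1) / np.sqrt(2.0 * np.pi * sigma**2)
```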
arXiv Detail & Related papers (2022-04-20T21:25:21Z)
- Measuring dissimilarity with diffeomorphism invariance [94.02751799024684]
We introduce DID, a pairwise dissimilarity measure applicable to a wide range of data spaces.
We prove that DID enjoys properties which make it relevant for theoretical study and practical use.
arXiv Detail & Related papers (2022-02-11T13:51:30Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies functional versions of the Polyak-Łojasiewicz (PL) condition (Polyak, 1963) and smoothness conditions, variational transport converges linearly.
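For intuition, here is the finite-dimensional analogue of that statement: if $F$ is $L$-smooth and satisfies the PL inequality $\tfrac{1}{2}\|\nabla F(x)\|^2 \ge \mu\,(F(x) - F^*)$, then gradient descent with step size $1/L$ obeys $F(x_{k+1}) - F^* \le (1 - \mu/L)(F(x_k) - F^*)$. A toy check on a quadratic (values illustrative):

```python
import numpy as np

# Quadratic F(x) = 0.5 * x^T A x with A = diag(1, 10): PL holds with
# mu = 1 (smallest eigenvalue) and F is L-smooth with L = 10.
A = np.diag([1.0, 10.0])
mu, L = 1.0, 10.0
F = lambda x: 0.5 * x @ A @ x   # minimum value F* = 0 at x = 0

x = np.array([1.0, 1.0])
gaps = [F(x)]
for _ in range(50):
    x = x - (1.0 / L) * (A @ x)  # gradient step with step size 1/L
    gaps.append(F(x))

ratios = np.array(gaps[1:]) / np.array(gaps[:-1])
print(ratios.max() <= 1.0 - mu / L)  # True: linear (geometric) convergence
```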
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
- $(f,\Gamma)$-Divergences: Interpolating between $f$-Divergences and Integral Probability Metrics [6.221019624345409]
We develop a framework for constructing information-theoretic divergences that subsume both $f$-divergences and integral probability metrics (IPMs).
We show that they can be expressed as a two-stage mass-redistribution/mass-transport process.
Using statistical learning as an example, we demonstrate their advantage in training generative adversarial networks (GANs) for heavy-tailed, not-absolutely continuous sample distributions.
arXiv Detail & Related papers (2020-11-11T18:17:09Z)
- Linear Optimal Transport Embedding: Provable Wasserstein classification for certain rigid transformations and perturbations [79.23797234241471]
Discriminating between distributions is an important problem in a number of scientific fields.
The Linear Optimal Transport (LOT) embedding maps the space of distributions into an $L^2$-space.
We demonstrate the benefits of LOT on a number of distribution classification problems.
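In one dimension the LOT embedding is explicit: with a uniform reference on $[0,1]$, the optimal map to a distribution is its quantile function, and the $L^2$ distance between two embeddings equals the 2-Wasserstein distance. A sketch on empirical samples (reference grid size and distributions are illustrative):

```python
import numpy as np

def lot_embed(sample, n_ref=512):
    """Embed an empirical distribution as its quantile function evaluated
    on a uniform grid -- the 1D LOT embedding w.r.t. a U[0,1] reference."""
    u = (np.arange(n_ref) + 0.5) / n_ref
    return np.quantile(sample, u)

rng = np.random.default_rng(1)
a = rng.normal(0.0, 1.0, 2000)   # N(0, 1) sample
b = rng.normal(2.0, 1.0, 2000)   # N(2, 1) sample

# L2 distance between embeddings ~ W_2(N(0,1), N(2,1)) = 2.
emb_a, emb_b = lot_embed(a), lot_embed(b)
print(np.sqrt(np.mean((emb_a - emb_b) ** 2)))
```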
arXiv Detail & Related papers (2020-08-20T19:09:33Z)
- Deep composition of tensor-trains using squared inverse Rosenblatt transports [0.6091702876917279]
This paper generalises the functional tensor-train approximation of the inverse Rosenblatt transport.
We develop an efficient procedure to compute this transport from a squared tensor-train decomposition.
The resulting deep inverse Rosenblatt transport significantly expands the capability of tensor approximations and transport maps to a wider class of random variables.
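The object being approximated here is the Rosenblatt transport, which maps a joint density to the uniform distribution on the cube through conditional CDFs, e.g. $R(x_1, x_2) = (F_1(x_1), F_{2|1}(x_2 \mid x_1))$ in two dimensions. A closed-form sketch for a bivariate Gaussian (the correlation value is illustrative; the paper's contribution is the tensor-train construction of such maps in high dimensions):

```python
import numpy as np
from scipy.stats import norm

# Rosenblatt transport of a bivariate Gaussian with correlation rho:
# R(x1, x2) = (Phi(x1), Phi((x2 - rho*x1) / sqrt(1 - rho^2))).
rho = 0.8
rng = np.random.default_rng(0)
z = rng.standard_normal((10_000, 2))
x1 = z[:, 0]
x2 = rho * z[:, 0] + np.sqrt(1.0 - rho**2) * z[:, 1]

u1 = norm.cdf(x1)                                     # marginal CDF of x1
u2 = norm.cdf((x2 - rho * x1) / np.sqrt(1.0 - rho**2))  # conditional CDF of x2 | x1
print(np.corrcoef(u1, u2)[0, 1])  # ~0: transported sample is ~uniform on the square
```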
arXiv Detail & Related papers (2020-07-14T11:04:18Z)
This list is automatically generated from the titles and abstracts of the papers on this site.