Probability Density Geodesics in Image Diffusion Latent Space
- URL: http://arxiv.org/abs/2504.06675v1
- Date: Wed, 09 Apr 2025 08:28:53 GMT
- Title: Probability Density Geodesics in Image Diffusion Latent Space
- Authors: Qingtao Yu, Jaskirat Singh, Zhaoyuan Yang, Peter Henry Tu, Jing Zhang, Hongdong Li, Richard Hartley, Dylan Campbell
- Abstract summary: We show that geodesics can be computed in diffusion latent space. We analyze how closely video clips approximate geodesics in a pre-trained image diffusion space.
- Score: 57.99700072218375
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models indirectly estimate the probability density over a data space, which can be used to study its structure. In this work, we show that geodesics can be computed in diffusion latent space, where the norm induced by the spatially-varying inner product is inversely proportional to the probability density. In this formulation, a path that traverses a high density (that is, probable) region of image latent space is shorter than the equivalent path through a low density region. We present algorithms for solving the associated initial and boundary value problems and show how to compute the probability density along the path and the geodesic distance between two points. Using these techniques, we analyze how closely video clips approximate geodesics in a pre-trained image diffusion space. Finally, we demonstrate how these techniques can be applied to training-free image sequence interpolation and extrapolation, given a pre-trained image diffusion model.
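The abstract's key construction can be illustrated numerically. The sketch below is not the paper's algorithm: it only discretises a path and scores it by summing Euclidean segment lengths divided by the density at segment midpoints, so that high-density regions are "cheap" to traverse. An isotropic Gaussian stands in for the density implied by a pre-trained diffusion model, and the path construction is a toy example.

```python
import numpy as np

def toy_density(x, sigma=1.0):
    """Unnormalised isotropic Gaussian (placeholder for a learned density)."""
    return float(np.exp(-0.5 * np.dot(x, x) / sigma**2))

def path_length(points, density=toy_density):
    """Discretised length: sum of ||b - a|| / p(midpoint) over segments."""
    total = 0.0
    for a, b in zip(points[:-1], points[1:]):
        total += np.linalg.norm(b - a) / density(0.5 * (a + b))
    return total

# A straight path through the high-density origin versus a detour, with the
# same endpoints, through a low-density region far from the origin.
direct = np.linspace([-2.0, 0.0], [2.0, 0.0], 100)
detour = np.stack([np.linspace(-2.0, 2.0, 100),
                   3.0 * np.sin(np.linspace(0.0, np.pi, 100))], axis=1)
```

Under this metric the direct path is shorter, matching the abstract's statement that a path through a probable region of latent space is shorter than the equivalent path through a low-density region.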
Related papers
- Generative Learning of Densities on Manifolds [3.081704060720176]
A generative modeling framework is proposed that combines diffusion models and manifold learning.
The approach utilizes Diffusion Maps to uncover possible low-dimensional underlying (latent) spaces in the high-dimensional data (ambient) space.
arXiv Detail & Related papers (2025-03-05T23:29:06Z) - Density Ratio Estimation via Sampling along Generalized Geodesics on Statistical Manifolds [0.951494089949975]
We geometrically reinterpret existing methods for density ratio estimation based on incremental mixtures.
Achieving this requires Monte Carlo sampling along geodesics via transformations of the two distributions.
Our experiments demonstrate that the proposed approach outperforms the existing approaches.
arXiv Detail & Related papers (2024-06-27T00:44:46Z) - A Heat Diffusion Perspective on Geodesic Preserving Dimensionality Reduction [66.21060114843202]
We propose a more general heat kernel based manifold embedding method that we call heat geodesic embeddings.
Results show that our method outperforms existing state of the art in preserving ground truth manifold distances.
We also showcase our method on single cell RNA-sequencing datasets with both continuum and cluster structure.
arXiv Detail & Related papers (2023-05-30T13:58:50Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
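The baseline being replaced can be sketched in a few lines. This toy 1-D example contrasts the naive ball-indicator density (a neighbour count within radius r) with a smooth Gaussian-kernel estimate; the paper's adaptive kernel diffusion density goes further than either, so this only illustrates the baseline.

```python
import numpy as np

rng = np.random.default_rng(0)
data = np.concatenate([rng.normal(0.0, 0.3, 500),   # dense cluster
                       rng.normal(4.0, 1.0, 500)])  # diffuse cluster

def ball_density(x, points, r=0.5):
    """Naive density: fraction of points inside the radius-r ball around x."""
    return np.mean(np.abs(points - x) <= r) / (2 * r)

def kernel_density(x, points, h=0.3):
    """Smooth Gaussian-kernel density estimate with fixed bandwidth h."""
    z = (points - x) / h
    return np.mean(np.exp(-0.5 * z**2)) / (h * np.sqrt(2 * np.pi))
```

Both estimators rank the dense cluster centre above the diffuse one; the indicator version, however, is discontinuous in x and blind to local smoothness, which is the shortcoming the proposed adaptive density addresses.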
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Featurized Density Ratio Estimation [82.40706152910292]
In our work, we propose to leverage an invertible generative model to map the two distributions into a common feature space prior to estimation.
This featurization brings the densities closer together in latent space, sidestepping pathological scenarios where the learned density ratios in input space can be arbitrarily inaccurate.
At the same time, the invertibility of our feature map guarantees that the ratios computed in feature space are equivalent to those in input space.
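The invariance claim follows from the change-of-variables formula: for an invertible map z = f(x), the Jacobian factors cancel in the ratio. A minimal numerical check, using Gaussians and a linear map as stand-ins for the learned densities and the invertible feature map:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[2.0, 0.5], [0.0, 1.5]])  # invertible linear "feature map"
abs_det = abs(np.linalg.det(A))

def gauss(x, mean, var):
    """Isotropic 2-D Gaussian density with scalar variance."""
    d = x - mean
    return np.exp(-0.5 * d @ d / var) / (2 * np.pi * var)

def p_x(x): return gauss(x, np.zeros(2), 1.0)
def q_x(x): return gauss(x, np.ones(2), 2.0)

# Push both densities through z = A x via change of variables.
def p_z(z): return p_x(np.linalg.solve(A, z)) / abs_det
def q_z(z): return q_x(np.linalg.solve(A, z)) / abs_det

x = rng.normal(size=2)
z = A @ x
ratio_input = p_x(x) / q_x(x)
ratio_feature = p_z(z) / q_z(z)
```

The |det A| factors cancel, so `ratio_feature` equals `ratio_input` up to floating-point error; the same cancellation holds for any diffeomorphism with the Jacobian determinant in place of |det A|.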
arXiv Detail & Related papers (2021-07-05T18:30:26Z) - Large-Scale Wasserstein Gradient Flows [84.73670288608025]
We introduce a scalable scheme to approximate Wasserstein gradient flows.
Our approach relies on input convex neural networks (ICNNs) to discretize the JKO steps.
As a result, we can sample from the measure at each step of the gradient flow and compute its density.
arXiv Detail & Related papers (2021-06-01T19:21:48Z) - Anomaly Detection with Density Estimation [2.0813318162800707]
We propose a new unsupervised anomaly detection technique (ANODE).
By estimating the probability density of the data in a signal region and in sidebands, a likelihood ratio of data vs. background can be constructed.
ANODE is robust against systematic differences between signal region and sidebands, giving it broader applicability than other methods.
arXiv Detail & Related papers (2020-01-14T19:00:02Z)
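The likelihood-ratio construction in the ANODE summary can be illustrated with histogram density estimates in place of the paper's neural density estimators. Everything here is a toy stand-in: a smooth exponential background, a localised Gaussian bump as signal, and a hypothetical signal-region window.

```python
import numpy as np

rng = np.random.default_rng(1)
background = rng.exponential(scale=2.0, size=100_000)  # smooth background
signal = rng.normal(loc=3.0, scale=0.1, size=2_000)    # localised excess
data = np.concatenate([background, signal])

# Hypothetical signal region in one feature; in practice the background
# density would be interpolated from the sidebands rather than known.
edges = np.linspace(2.5, 3.5, 21)
centres = 0.5 * (edges[:-1] + edges[1:])

data_density, _ = np.histogram(data, bins=edges, density=True)
bkg_density, _ = np.histogram(background, bins=edges, density=True)

# Likelihood ratio of data vs. background per bin; values well above 1
# flag an excess over the background expectation.
ratio = data_density / np.maximum(bkg_density, 1e-12)
```

The ratio peaks in the bins around the injected bump and stays near or below 1 elsewhere, which is the anomaly signature the summary describes.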
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.