Enabling Probabilistic Learning on Manifolds through Double Diffusion Maps
- URL: http://arxiv.org/abs/2506.02254v1
- Date: Mon, 02 Jun 2025 20:58:49 GMT
- Title: Enabling Probabilistic Learning on Manifolds through Double Diffusion Maps
- Authors: Dimitris G. Giovanis, Nikolaos Evangelou, Ioannis G. Kevrekidis, Roger G. Ghanem
- Abstract summary: We present a generative learning framework for probabilistic sampling based on an extension of the Probabilistic Learning on Manifolds (PLoM) approach. We solve a full-order ISDE directly in the latent space, preserving the full dynamical complexity of the system.
- Score: 3.081704060720176
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a generative learning framework for probabilistic sampling based on an extension of the Probabilistic Learning on Manifolds (PLoM) approach, which is designed to generate statistically consistent realizations of a random vector in a finite-dimensional Euclidean space, informed by a limited (yet representative) set of observations. In its original form, PLoM constructs a reduced-order probabilistic model by combining three main components: (a) kernel density estimation to approximate the underlying probability measure, (b) Diffusion Maps to uncover the intrinsic low-dimensional manifold structure, and (c) a reduced-order Ito Stochastic Differential Equation (ISDE) to sample from the learned distribution. A key challenge arises, however, when the number of available data points N is small and the dimensionality of the diffusion-map basis approaches N, resulting in overfitting and loss of generalization. To overcome this limitation, we propose an enabling extension that implements a synthesis of Double Diffusion Maps -- a technique capable of capturing multiscale geometric features of the data -- with Geometric Harmonics (GH), a nonparametric reconstruction method that allows smooth nonlinear interpolation in high-dimensional ambient spaces. This approach enables us to solve a full-order ISDE directly in the latent space, preserving the full dynamical complexity of the system, while leveraging its reduced geometric representation. The effectiveness and robustness of the proposed method are illustrated through two numerical studies: one based on data generated from two-dimensional Hermite polynomial functions and another based on high-fidelity simulations of a detonation wave in a reactive flow.
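Component (b) of the PLoM pipeline, the Diffusion Maps embedding, can be illustrated with a minimal sketch. This is not the authors' implementation; the Gaussian kernel bandwidth `epsilon` and the number of retained coordinates `n_coords` are illustrative parameters, and the normalization shown is the basic Markov (alpha = 0) variant.

```python
import numpy as np

def diffusion_maps(X, epsilon, n_coords):
    """Minimal Diffusion Maps sketch: Gaussian kernel, Markov
    normalization, eigendecomposition, leading nontrivial coordinates."""
    # Pairwise squared Euclidean distances between the N samples
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / epsilon)
    # Row-normalize to obtain a Markov transition matrix P
    D = K.sum(axis=1)
    # Symmetric conjugate of P for a numerically stable eigensolve
    S = K / np.sqrt(np.outer(D, D))
    vals, vecs = np.linalg.eigh((S + S.T) / 2)
    order = np.argsort(vals)[::-1]
    vals, vecs = vals[order], vecs[:, order]
    # Recover right eigenvectors of P; skip the trivial constant mode
    psi = vecs / np.sqrt(D)[:, None]
    return psi[:, 1:n_coords + 1] * vals[1:n_coords + 1]
```

In the full PLoM setting these coordinates feed the reduced-order (here, per the proposed extension, full-order latent-space) ISDE sampler, and Geometric Harmonics provides the smooth map back to the ambient space.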
Related papers
- Guided Diffusion Sampling on Function Spaces with Applications to PDEs [111.87523128566781]
We propose a general framework for conditional sampling in PDE-based inverse problems. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines.
arXiv Detail & Related papers (2025-05-22T17:58:12Z) - Improving the Euclidean Diffusion Generation of Manifold Data by Mitigating Score Function Singularity [7.062379942776126]
We investigate direct sampling of Euclidean diffusion models for general manifold-constrained data. We reveal the multiscale singularity of the score function in the embedded space of the manifold, which hinders the accuracy of diffusion-generated samples. We propose two novel methods to mitigate the singularity and improve the sampling accuracy.
arXiv Detail & Related papers (2025-05-15T03:12:27Z) - Riemannian Denoising Diffusion Probabilistic Models [7.964790563398277]
We propose RDDPMs for learning distributions on submanifolds of Euclidean space that are level sets of functions. We provide a theoretical analysis of our method in the continuous-time limit. The capability of our method is demonstrated on datasets from previous studies and on new sampled datasets.
arXiv Detail & Related papers (2025-05-07T11:37:16Z) - Proper Latent Decomposition [4.266376725904727]
We compute a reduced set of intrinsic coordinates (latent space) to accurately describe a flow with fewer degrees of freedom than the numerical discretization. Within this numerical framework, we propose an algorithm to perform PLD on the manifold. This work opens opportunities for analyzing autoencoders and latent spaces, for nonlinear reduced-order modeling, and for scientific insights into the structure of high-dimensional data.
arXiv Detail & Related papers (2024-12-01T12:19:08Z) - Total Uncertainty Quantification in Inverse PDE Solutions Obtained with Reduced-Order Deep Learning Surrogate Models [50.90868087591973]
We propose an approximate Bayesian method for quantifying the total uncertainty in inverse PDE solutions obtained with machine learning surrogate models.
We test the proposed framework by comparing it with the iterative ensemble smoother and deep ensembling methods for a non-linear diffusion equation.
arXiv Detail & Related papers (2024-08-20T19:06:02Z) - A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
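The mean-shift algorithm referenced above is classical and easy to sketch: each step moves a point toward the kernel-weighted average of the data, which ascends the kernel density estimate toward a mode. The bandwidth and toy data below are illustrative, not from the cited paper.

```python
import numpy as np

def mean_shift_step(x, data, bandwidth):
    # Gaussian-kernel mean shift: replace x with the kernel-weighted
    # mean of the data points (one mode-seeking step on the KDE)
    w = np.exp(-np.sum((data - x) ** 2, axis=1) / (2 * bandwidth ** 2))
    return (w[:, None] * data).sum(axis=0) / w.sum()

# Iterating the step converges to a local mode of the density estimate
data = np.array([[0.0, 0.0], [0.1, 0.0], [-0.1, 0.1], [5.0, 5.0]])
x = np.array([0.5, 0.5])
for _ in range(50):
    x = mean_shift_step(x, data, bandwidth=0.5)
```

Starting near the origin cluster, the iterate settles at that cluster's mode; the distant point at (5, 5) receives negligible weight.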
arXiv Detail & Related papers (2023-05-31T15:33:16Z) - Decomposed Diffusion Sampler for Accelerating Large-Scale Inverse Problems [64.29491112653905]
We propose a novel and efficient diffusion sampling strategy that synergistically combines the diffusion sampling and Krylov subspace methods.
Specifically, we prove that if tangent space at a denoised sample by Tweedie's formula forms a Krylov subspace, then the CG with the denoised data ensures the data consistency update to remain in the tangent space.
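Tweedie's formula, invoked in the summary above, states that the posterior mean of the clean signal given a noisy observation is the observation plus the noise variance times the score: E[x0 | x] = x + sigma^2 * grad log p(x). A minimal check in the conjugate-Gaussian case, where the score is available in closed form (the values of mu, tau, and sigma here are arbitrary illustrations):

```python
import numpy as np

def tweedie_denoise(x, sigma, score):
    # Tweedie's formula: E[x0 | x] = x + sigma^2 * grad log p(x)
    return x + sigma ** 2 * score(x)

# For x0 ~ N(mu, tau^2) and x = x0 + sigma * eps, the marginal of x is
# N(mu, tau^2 + sigma^2), so its score is closed-form:
mu, tau, sigma = 2.0, 1.0, 0.5
score = lambda x: (mu - x) / (tau ** 2 + sigma ** 2)

x = np.array([3.0])
x0_hat = tweedie_denoise(x, sigma, score)
```

Expanding the formula recovers the familiar conjugate-Gaussian posterior mean (tau^2 * x + sigma^2 * mu) / (tau^2 + sigma^2), which is the consistency the sketch verifies.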
Our proposed method achieves more than 80 times faster inference time than the previous state-of-the-art method.
arXiv Detail & Related papers (2023-03-10T07:42:49Z) - Orthogonal Matrix Retrieval with Spatial Consensus for 3D Unknown-View Tomography [58.60249163402822]
Unknown-view tomography (UVT) reconstructs a 3D density map from its 2D projections at unknown, random orientations.
The proposed OMR is more robust and performs significantly better than the previous state-of-the-art OMR approach.
arXiv Detail & Related papers (2022-07-06T21:40:59Z) - Reinforcement Learning from Partial Observation: Linear Function Approximation with Provable Sample Efficiency [111.83670279016599]
We study reinforcement learning for partially observed decision processes (POMDPs) with infinite observation and state spaces.
We make the first attempt at partial observability and function approximation for a class of POMDPs with a linear structure.
arXiv Detail & Related papers (2022-04-20T21:15:38Z) - Manifold learning-based polynomial chaos expansions for high-dimensional surrogate models [0.0]
We introduce a manifold learning-based method for uncertainty quantification (UQ).
The proposed method is able to achieve highly accurate approximations which ultimately lead to the significant acceleration of UQ tasks.
arXiv Detail & Related papers (2021-07-21T00:24:15Z) - Improving Metric Dimensionality Reduction with Distributed Topology [68.8204255655161]
DIPOLE is a dimensionality-reduction post-processing step that corrects an initial embedding by minimizing a loss functional with both a local, metric term and a global, topological term.
We observe that DIPOLE outperforms popular methods like UMAP, t-SNE, and Isomap on a number of popular datasets.
arXiv Detail & Related papers (2021-06-14T17:19:44Z) - Posterior-Aided Regularization for Likelihood-Free Inference [23.708122045184698]
Posterior-Aided Regularization (PAR) is applicable to learning the density estimator, regardless of the model structure.
We provide a unified estimation method of PAR to estimate both reverse KL term and mutual information term with a single neural network.
arXiv Detail & Related papers (2021-02-15T16:59:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.