Isometric Gaussian Process Latent Variable Model for Dissimilarity Data
- URL: http://arxiv.org/abs/2006.11741v2
- Date: Tue, 8 Jun 2021 15:45:54 GMT
- Title: Isometric Gaussian Process Latent Variable Model for Dissimilarity Data
- Authors: Martin Jørgensen and Søren Hauberg
- Abstract summary: We present a probabilistic model where the latent variable respects both the distances and the topology of the modeled data.
The model is inferred by variational inference based on observations of pairwise distances.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a probabilistic model where the latent variable respects both the
distances and the topology of the modeled data. The model leverages the
Riemannian geometry of the generated manifold to endow the latent space with a
well-defined stochastic distance measure, which is modeled locally as Nakagami
distributions. These stochastic distances are sought to be as similar as
possible to observed distances along a neighborhood graph through a censoring
process. The model is inferred by variational inference based on observations
of pairwise distances. We demonstrate how the new model can encode invariances
in the learned manifolds.
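The abstract ties latent geometry to observed pairwise distances through Nakagami-distributed stochastic distances. The sketch below is a deliberately simplified, hypothetical illustration of that coupling, not the authors' implementation: it replaces the paper's Riemannian GP-LVM distances with plain Euclidean distances between learnable latent points, omits the censoring step, and uses maximum likelihood in PyTorch instead of the variational scheme described above. All names (`nakagami_log_prob`, `fit_latents`, `d_obs`, `pairs`) are illustrative assumptions.

```python
# Simplified sketch: fit latent points so that observed neighborhood-graph
# distances are well explained by Nakagami distributions whose spread is
# tied to latent Euclidean distances. (The paper instead uses Riemannian
# stochastic distances from a GP decoder, a censoring process, and
# variational inference.)
import math
import torch

def nakagami_log_prob(x, m, omega):
    """Log-density of Nakagami(m, omega) at x > 0."""
    return (math.log(2.0) + m * torch.log(m) - torch.lgamma(m)
            - m * torch.log(omega)
            + (2 * m - 1) * torch.log(x) - m * x ** 2 / omega)

def fit_latents(d_obs, pairs, n_points, latent_dim=2, steps=2000, lr=1e-2):
    """d_obs[k]: positive observed distance for edge pairs[k] = (i, j)
    of a kNN graph; pairs is a LongTensor of shape (E, 2)."""
    z = torch.randn(n_points, latent_dim, requires_grad=True)
    log_m = torch.zeros(1, requires_grad=True)       # Nakagami shape, log-scale
    opt = torch.optim.Adam([z, log_m], lr=lr)
    i, j = pairs[:, 0], pairs[:, 1]
    for _ in range(steps):
        opt.zero_grad()
        # Spread parameter set so that E[d^2] equals the squared latent distance.
        omega = ((z[i] - z[j]) ** 2).sum(-1) + 1e-6
        m = torch.exp(log_m) + 0.5                   # Nakagami requires m >= 1/2
        loss = -nakagami_log_prob(d_obs, m, omega).mean()
        loss.backward()
        opt.step()
    return z.detach(), torch.exp(log_m).item() + 0.5
```

Recovering the model described in the abstract would additionally require a GP decoder whose pullback Riemannian metric induces the stochastic distances, together with the censored variational objective over the neighborhood graph.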
Related papers
- Reconstructing Galaxy Cluster Mass Maps using Score-based Generative Modeling [9.386611764730791]
We present a novel approach to reconstruct gas and dark matter projected density maps of galaxy clusters using score-based generative modeling.
Our diffusion model takes in mock SZ and X-ray images as conditional observations, and generates realizations of corresponding gas and dark matter maps by sampling from a learned data posterior.
arXiv Detail & Related papers (2024-10-03T18:00:03Z) - Latent diffusion models for parameterization and data assimilation of facies-based geomodels [0.0]
Diffusion models are trained to generate new geological realizations from input fields characterized by random noise.
Latent diffusion models are shown to provide realizations that are visually consistent with samples from geomodeling software.
arXiv Detail & Related papers (2024-06-21T01:32:03Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Manifold-augmented Eikonal Equations: Geodesic Distances and Flows on Differentiable Manifolds [5.0401589279256065]
We show how the geometry of a manifold impacts the distance field, and exploit the geodesic flow to obtain globally length-minimising curves directly.
This work opens opportunities for statistics and reduced-order modelling on differentiable manifolds.
arXiv Detail & Related papers (2023-10-09T21:11:13Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator means that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and their accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Score Approximation, Estimation and Distribution Recovery of Diffusion Models on Low-Dimensional Data [68.62134204367668]
This paper studies score approximation, estimation, and distribution recovery of diffusion models, when data are supported on an unknown low-dimensional linear subspace.
We show that with a properly chosen neural network architecture, the score function can be both accurately approximated and efficiently estimated.
The generated distribution based on the estimated score function captures the data geometric structures and converges to a close vicinity of the data distribution.
arXiv Detail & Related papers (2023-02-14T17:02:35Z) - Counting Phases and Faces Using Bayesian Thermodynamic Integration [77.34726150561087]
We introduce a new approach to reconstructing thermodynamic functions and phase boundaries in two-parameter statistical mechanics systems.
We use the proposed approach to accurately reconstruct the partition functions and phase diagrams of the Ising model and the exactly solvable non-equilibrium TASEP.
arXiv Detail & Related papers (2022-05-18T17:11:23Z) - A Model for Multi-View Residual Covariances based on Perspective Deformation [88.21738020902411]
We derive a model for the covariance of the visual residuals in multi-view SfM, odometry and SLAM setups.
We validate our model with synthetic and real data and integrate it into photometric and feature-based Bundle Adjustment.
arXiv Detail & Related papers (2022-02-01T21:21:56Z) - Scalable mixed-domain Gaussian process modeling and model reduction for longitudinal data [5.00301731167245]
We derive a basis function approximation scheme for mixed-domain covariance functions.
We show that we can approximate the exact GP model accurately in a fraction of the runtime.
We also demonstrate a scalable model reduction workflow for obtaining smaller and more interpretable models.
arXiv Detail & Related papers (2021-11-03T04:47:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences arising from its use.