On the Shape of Latent Variables in a Denoising VAE-MoG: A Posterior Sampling-Based Study
- URL: http://arxiv.org/abs/2509.25382v1
- Date: Mon, 29 Sep 2025 18:33:09 GMT
- Title: On the Shape of Latent Variables in a Denoising VAE-MoG: A Posterior Sampling-Based Study
- Authors: Fernanda Zapata Bascuñán
- Abstract summary: We explore the latent space of a denoising variational autoencoder with a mixture-of-Gaussians prior (VAE-MoG). To evaluate how well the model captures the underlying structure, we use Hamiltonian Monte Carlo (HMC) to draw posterior samples conditioned on clean inputs, and compare them to the encoder's outputs from noisy data. Although the model reconstructs signals accurately, statistical comparisons reveal a clear mismatch in the latent space.
- Score: 51.56484100374058
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: In this work, we explore the latent space of a denoising variational autoencoder with a mixture-of-Gaussians prior (VAE-MoG), trained on gravitational wave data from event GW150914. To evaluate how well the model captures the underlying structure, we use Hamiltonian Monte Carlo (HMC) to draw posterior samples conditioned on clean inputs, and compare them to the encoder's outputs from noisy data. Although the model reconstructs signals accurately, statistical comparisons reveal a clear mismatch in the latent space. This shows that strong denoising performance doesn't necessarily mean the latent representations are reliable, highlighting the importance of posterior-based validation when evaluating generative models.
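The validation idea the abstract describes — statistically comparing encoder outputs against HMC posterior samples in latent space — can be sketched with a toy check. This is an illustrative stand-in, not the paper's code: the "encoder latents" and "HMC samples" below are synthetic Gaussian draws, and the comparison uses a simple two-sample Kolmogorov-Smirnov statistic on one latent dimension.

```python
import bisect
import random

def ks_statistic(xs, ys):
    """Two-sample KS statistic: max gap between the empirical CDFs."""
    xs, ys = sorted(xs), sorted(ys)

    def ecdf(sorted_sample, v):
        # Fraction of the sample that is <= v.
        return bisect.bisect_right(sorted_sample, v) / len(sorted_sample)

    return max(abs(ecdf(xs, v) - ecdf(ys, v)) for v in xs + ys)

random.seed(0)
# Stand-ins for one latent dimension: encoder outputs vs. posterior samples.
encoder_latents = [random.gauss(0.0, 1.0) for _ in range(500)]
hmc_samples_matched = [random.gauss(0.0, 1.0) for _ in range(500)]
hmc_samples_shifted = [random.gauss(0.8, 1.0) for _ in range(500)]

d_match = ks_statistic(encoder_latents, hmc_samples_matched)
d_shift = ks_statistic(encoder_latents, hmc_samples_shifted)
print(f"KS vs matched posterior: {d_match:.3f}")
print(f"KS vs shifted posterior: {d_shift:.3f}")
```

A large KS gap between the encoder's latent distribution and the reference posterior is the kind of mismatch the paper reports: reconstructions can look fine while the latent distributions disagree.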
Related papers
- Noise Conditional Variational Score Distillation [60.38982038894823]
Noise Conditional Variational Score Distillation (NCVSD) is a novel method for distilling pretrained diffusion models into generative denoisers. By integrating this insight into the Variational Score Distillation framework, we enable scalable learning of generative denoisers.
arXiv Detail & Related papers (2025-06-11T06:01:39Z)
- Spatial Reasoning with Denoising Models [49.83744014336816]
We introduce a framework to perform reasoning over sets of continuous variables via denoising generative models. For the first time, the order of generation can successfully be predicted by the denoising network itself. Using these findings, we can increase the accuracy of specific reasoning tasks from 1% to >50%.
arXiv Detail & Related papers (2025-02-28T14:08:30Z)
- Posterior Sampling with Denoising Oracles via Tilted Transport [37.14320147233444]
We introduce the *tilted transport* technique, which leverages the quadratic structure of the log-likelihood in linear inverse problems.
We quantify the conditions under which this boosted posterior is strongly log-concave, highlighting the dependencies on the condition number of the measurement matrix.
The resulting posterior sampling scheme is shown to reach the computational threshold predicted for sampling Ising models.
arXiv Detail & Related papers (2024-06-30T16:11:42Z)
- Interpreting and Improving Diffusion Models from an Optimization Perspective [4.5993996573872185]
We use this observation to interpret denoising diffusion models as approximate gradient descent applied to the Euclidean distance function.
We propose a new gradient-estimation sampler, generalizing DDIM using insights from our theoretical results.
arXiv Detail & Related papers (2023-06-08T00:56:33Z)
- To smooth a cloud or to pin it down: Guarantees and Insights on Score Matching in Denoising Diffusion Models [20.315727650065007]
Denoising diffusion models are a class of generative models which have recently achieved state-of-the-art results across many domains.
We leverage known connections to control akin to the Föllmer drift to extend established neural network approximation results for the Föllmer drift to denoising diffusion models and samplers.
arXiv Detail & Related papers (2023-05-16T16:56:19Z)
- ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model-class namely "Denoising Diffusion Probabilistic Models" or DDPMs for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive: it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z)
- Posterior samples of source galaxies in strong gravitational lenses with score-based priors [107.52670032376555]
We use a score-based model to encode the prior for the inference of undistorted images of background galaxies.
We show how the balance between the likelihood and the prior meet our expectations in an experiment with out-of-distribution data.
arXiv Detail & Related papers (2022-11-07T19:00:42Z)
- From Denoising Diffusions to Denoising Markov Models [38.33676858989955]
Denoising diffusions are state-of-the-art generative models exhibiting remarkable empirical performance.
We propose a unifying framework generalising this approach to a wide class of spaces and leading to an original extension of score matching.
arXiv Detail & Related papers (2022-11-07T14:34:27Z)
- PriorGrad: Improving Conditional Denoising Diffusion Models with Data-Driven Adaptive Prior [103.00403682863427]
We propose PriorGrad to improve the efficiency of the conditional diffusion model.
We show that PriorGrad achieves faster convergence, leading to data and parameter efficiency and improved quality.
arXiv Detail & Related papers (2021-06-11T14:04:03Z)
This list is automatically generated from the titles and abstracts of the papers on this site.