Coupled Data and Measurement Space Dynamics for Enhanced Diffusion Posterior Sampling
- URL: http://arxiv.org/abs/2510.09676v1
- Date: Wed, 08 Oct 2025 18:59:16 GMT
- Title: Coupled Data and Measurement Space Dynamics for Enhanced Diffusion Posterior Sampling
- Authors: Shayan Mohajer Hamidi, En-Hui Yang, Ben Liang
- Abstract summary: Inverse problems, where the goal is to recover an unknown signal from noisy or incomplete measurements, are central to medical imaging, remote sensing, and computational biology. We propose a novel framework called \emph{coupled data and measurement space diffusion posterior sampling} (C-DPS), which eliminates the need for constraint tuning or likelihood approximation. C-DPS consistently outperforms existing baselines, both qualitatively and quantitatively, across multiple inverse problem benchmarks.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Inverse problems, where the goal is to recover an unknown signal from noisy or incomplete measurements, are central to applications in medical imaging, remote sensing, and computational biology. Diffusion models have recently emerged as powerful priors for solving such problems. However, existing methods either rely on projection-based techniques that enforce measurement consistency through heuristic updates, or they approximate the likelihood $p(\boldsymbol{y} \mid \boldsymbol{x})$, often resulting in artifacts and instability under complex or high-noise conditions. To address these limitations, we propose a novel framework called \emph{coupled data and measurement space diffusion posterior sampling} (C-DPS), which eliminates the need for constraint tuning or likelihood approximation. C-DPS introduces a forward stochastic process in the measurement space $\{\boldsymbol{y}_t\}$, evolving in parallel with the data-space diffusion $\{\boldsymbol{x}_t\}$, which enables the derivation of a closed-form posterior $p(\boldsymbol{x}_{t-1} \mid \boldsymbol{x}_t, \boldsymbol{y}_{t-1})$. This coupling allows for accurate and recursive sampling based on a well-defined posterior distribution. Empirical results demonstrate that C-DPS consistently outperforms existing baselines, both qualitatively and quantitatively, across multiple inverse problem benchmarks.
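The coupled dynamics in the abstract can be sketched in a toy setting. The snippet below is a hypothetical NumPy illustration, not the authors' C-DPS implementation: it assumes a linear Gaussian inverse problem $y = Ax + n$ with an analytic $N(0, I)$ prior (so the denoiser is closed-form), runs a DDPM-style reverse process on $x_t$, draws a noised measurement $y_{t-1}$ from a parallel forward process in measurement space, and conditions each reverse step on it via Gaussian conditioning. All schedule and operator choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
A = np.eye(d)                          # toy forward operator (assumed)
sigma_n = 0.1                          # measurement noise std (assumed)
x_true = rng.normal(size=d)
y = A @ x_true + sigma_n * rng.normal(size=d)

T = 200
betas = np.linspace(1e-4, 0.02, T)     # standard DDPM-style schedule
alphas = 1.0 - betas
abar = np.cumprod(alphas)

x = rng.normal(size=d)                 # start from the terminal prior
for t in range(T - 1, 0, -1):
    ab, ab_prev = abar[t], abar[t - 1]
    # Forward process in measurement space: a noised copy y_{t-1} of y.
    y_prev = np.sqrt(ab_prev) * y + np.sqrt(1 - ab_prev) * rng.normal(size=d)
    # Analytic denoiser for the toy prior x0 ~ N(0, I): E[x0 | x_t].
    x0_prior = np.sqrt(ab) * x
    # Closed-form Gaussian conditioning of x0 on y_{t-1}; y_{t-1} sees
    # x0 through sqrt(ab_prev) * A with correspondingly inflated noise.
    r2 = 1.0 - ab                      # heuristic prior uncertainty
    s2 = ab_prev * sigma_n**2 + (1.0 - ab_prev)
    A_eff = np.sqrt(ab_prev) * A
    P = np.linalg.inv(A_eff.T @ A_eff / s2 + np.eye(d) / r2)
    x0_hat = P @ (A_eff.T @ y_prev / s2 + x0_prior / r2)
    # Standard DDPM posterior step for x_{t-1} given x_t and x0_hat.
    mu = (np.sqrt(ab_prev) * betas[t] / (1 - ab) * x0_hat
          + np.sqrt(alphas[t]) * (1 - ab_prev) / (1 - ab) * x)
    x = mu + np.sqrt(betas[t]) * rng.normal(size=d)

x_hat = x  # approximate posterior sample, anchored to the measurement
```

With the identity operator and small measurement noise, the sample lands close to the true signal; the point of the sketch is only that measurement consistency enters through a well-defined conditional at every step rather than a post-hoc projection.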
Related papers
- Efficient Sampling with Discrete Diffusion Models: Sharp and Adaptive Guarantees [9.180350432640912]
We study the sampling efficiency of score-based discrete diffusion models under a continuous-time Markov chain (CTMC) formulation. For uniform discrete diffusion, we show that the $\tau$-leaping algorithm achieves a complexity of order $\tilde{O}(d/\varepsilon)$. For masking discrete diffusion, we introduce a modified $\tau$-leaping sampler whose convergence rate is governed by an intrinsic information-theoretic quantity.
arXiv Detail & Related papers (2026-02-16T18:48:17Z) - Phase-space entropy at acquisition reflects downstream learnability [54.4100065023873]
We propose an acquisition-level scalar $S_{\mathcal{B}}$ based on instrument-resolved phase space. We show theoretically that $S_{\mathcal{B}}$ correctly identifies the phase-space coherence of periodic sampling. $|S_{\mathcal{B}}|$ consistently ranks sampling geometries and predicts downstream reconstruction/recognition difficulty \emph{without} training.
arXiv Detail & Related papers (2025-12-22T10:03:51Z) - Injecting Measurement Information Yields a Fast and Noise-Robust Diffusion-Based Inverse Problem Solver [26.516117473433795]
We propose to estimate the conditional posterior mean $\mathbb{E}[\mathbf{x}_0 \mid \mathbf{x}_t, \mathbf{y}]$. The resulting prediction can be integrated into any standard sampler, resulting in a fast and memory-efficient inverse solver.
arXiv Detail & Related papers (2025-08-05T00:01:41Z) - Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion [25.542593757387095]
We propose Quantized Transition Diffusion (QTD), a novel approach that integrates data quantization with discrete diffusion dynamics. Our method first transforms the continuous data distribution $p_*$ into a discrete one $q_*$ via histogram approximation and binary encoding. For reverse-time sampling, we introduce a \textit{truncated uniformization} technique to simulate the reverse CTMC.
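The quantization step mentioned above (histogram approximation plus binary encoding) can be illustrated with a small sketch; the bin count, helper name, and code layout here are my own assumptions, not the paper's exact construction.

```python
import numpy as np

def quantize(samples, bits=4):
    """Map continuous samples to 2**bits histogram bins and binary codes."""
    lo, hi = samples.min(), samples.max()
    n_bins = 2 ** bits
    edges = np.linspace(lo, hi, n_bins + 1)
    # Bin index per sample; clip handles the right boundary value.
    idx = np.clip(np.digitize(samples, edges) - 1, 0, n_bins - 1)
    centers = (edges[:-1] + edges[1:]) / 2
    codes = [format(int(i), f"0{bits}b") for i in idx]  # binary encoding
    return idx, centers, codes

rng = np.random.default_rng(1)
x = rng.normal(size=1000)
idx, centers, codes = quantize(x, bits=4)
recon = centers[idx]                    # dequantized values
max_err = np.max(np.abs(recon - x))     # at most half a bin width
```

The discrete diffusion then operates on the binary codes; dequantization maps bin indices back to bin centers, with error bounded by half the bin width.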
arXiv Detail & Related papers (2025-05-28T02:10:11Z) - The Spacetime of Diffusion Models: An Information Geometry Perspective [40.23096112113255]
We show that the standard pullback approach, utilizing the deterministic probability flow ODE decoder, is fundamentally flawed. We introduce a latent spacetime $z=(x_t,t)$ that indexes the family of denoising distributions $p(x_0 \mid x_t, t)$ across all noise scales. The resulting structure induces a principled diffusion edit distance, where geodesics trace minimal sequences of noise and denoise edits between data.
arXiv Detail & Related papers (2025-05-23T06:16:58Z) - Conditional Mutual Information Based Diffusion Posterior Sampling for Solving Inverse Problems [3.866047645663101]
In computer vision, tasks such as inpainting, deblurring, and super-resolution are commonly formulated as inverse problems. Recently, diffusion models (DMs) have emerged as a promising approach for addressing noisy linear inverse problems. We propose an information-theoretic approach to improve the effectiveness of DMs in solving inverse problems.
arXiv Detail & Related papers (2025-01-06T09:45:26Z) - Active Diffusion Subsampling [15.028061496012924]
In maximum entropy sampling, one selects measurement locations that are expected to have the highest entropy, so as to minimize uncertainty about $x$. Recently, diffusion models have been shown to produce high-quality posterior samples of high-dimensional signals using guided diffusion. We propose Active Diffusion Subsampling (ADS), a method for designing intelligent subsampling masks using guided diffusion.
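As an illustration of the maximum-entropy selection rule described above, the toy sketch below scores candidate measurement locations by the per-coordinate variance of posterior samples (the entropy of a Gaussian grows with its variance) and picks the most uncertain unmeasured one. The setup and function name are hypothetical, not ADS itself.

```python
import numpy as np

def next_measurement(posterior_samples, already_measured):
    """Pick the location where posterior samples disagree most."""
    var = posterior_samples.var(axis=0)       # uncertainty per location
    var[list(already_measured)] = -np.inf     # never re-measure
    return int(np.argmax(var))

rng = np.random.default_rng(2)
# 64 hypothetical posterior samples of a 10-dimensional signal.
samples = rng.normal(size=(64, 10))
samples[:, 3] *= 5.0                          # location 3 is most uncertain
pick = next_measurement(samples, already_measured={0, 1})
```

In a guided-diffusion setting, `posterior_samples` would come from the diffusion model conditioned on the measurements gathered so far, and the loop would alternate between measuring and re-sampling.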
arXiv Detail & Related papers (2024-06-20T15:05:06Z) - Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\rm post}(\mathbf{x}) \propto p(\mathbf{x})r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$. We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
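A minimal baseline for sampling from a posterior of the form $p^{\rm post}(\mathbf{x}) \propto p(\mathbf{x})r(\mathbf{x})$, assuming one can draw from the prior and evaluate $r$, is self-normalized importance sampling. This is only a point of reference for what "posterior over data" means here, not the relative trajectory balance objective the paper actually trains.

```python
import numpy as np

rng = np.random.default_rng(3)
# Prior p(x) = N(0, 1); constraint r(x) proportional to N(2, 1).
# The product posterior is then N(1, 1/2), so E_post[x] = 1 exactly.
prior = rng.normal(size=(5000, 1))            # draws from the prior
r = np.exp(-0.5 * (prior[:, 0] - 2.0) ** 2)   # black-box reward r(x)
w = r / r.sum()                               # self-normalized weights
post_mean = float(w @ prior[:, 0])            # estimate of E_post[x]
```

Importance sampling degrades badly in high dimensions, which is precisely why amortized approaches that train the sampler itself are of interest.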
arXiv Detail & Related papers (2024-05-31T16:18:46Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative
Models [49.81937966106691]
We develop a suite of non-asymptotic theory towards understanding the data generation process of diffusion models.
In contrast to prior works, our theory is developed based on an elementary yet versatile non-asymptotic approach.
arXiv Detail & Related papers (2023-06-15T16:30:08Z) - Diffusion Models for Causal Discovery via Topological Ordering [20.875222263955045]
emphTopological ordering approaches reduce the optimisation space of causal discovery by searching over a permutation rather than graph space.
For ANMs, the emphHessian of the data log-likelihood can be used for finding leaf nodes in a causal graph, allowing its topological ordering.
We introduce theory for updating the learned Hessian without re-training the neural network, and we show that computing with a subset of samples gives an accurate approximation of the ordering.
arXiv Detail & Related papers (2022-10-12T13:36:29Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - Towards Sample-Optimal Compressive Phase Retrieval with Sparse and
Generative Priors [59.33977545294148]
We show that $O(k log L)$ samples suffice to guarantee that the signal is close to any vector that minimizes an amplitude-based empirical loss function.
We adapt this result to sparse phase retrieval, and show that $O(s log n)$ samples are sufficient for a similar guarantee when the underlying signal is $s$-sparse and $n$-dimensional.
arXiv Detail & Related papers (2021-06-29T12:49:54Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.