Supervised Guidance Training for Infinite-Dimensional Diffusion Models
- URL: http://arxiv.org/abs/2601.20756v1
- Date: Wed, 28 Jan 2026 16:39:39 GMT
- Title: Supervised Guidance Training for Infinite-Dimensional Diffusion Models
- Authors: Elizabeth L. Baker, Alexander Denker, Jes Frellsen,
- Abstract summary: In inverse problems, the aim is to sample from a posterior distribution over functions obtained by conditioning a prior. We prove that the models can be conditioned using an infinite-dimensional extension of Doob's $h$-transform. We propose a simulation-free score matching objective (called Supervised Guidance Training) enabling efficient and stable posterior sampling.
- Score: 47.65586147952857
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Score-based diffusion models have recently been extended to infinite-dimensional function spaces, with uses such as inverse problems arising from partial differential equations. In the Bayesian formulation of inverse problems, the aim is to sample from a posterior distribution over functions obtained by conditioning a prior on noisy observations. While diffusion models provide expressive priors in function space, the theory of conditioning them to sample from the posterior remains open. We address this, assuming that either the prior lies in the Cameron-Martin space, or is absolutely continuous with respect to a Gaussian measure. We prove that the models can be conditioned using an infinite-dimensional extension of Doob's $h$-transform, and that the conditional score decomposes into an unconditional score and a guidance term. As the guidance term is intractable, we propose a simulation-free score matching objective (called Supervised Guidance Training) enabling efficient and stable posterior sampling. We illustrate the theory with numerical examples on Bayesian inverse problems in function spaces. In summary, our work offers the first function-space method for fine-tuning trained diffusion models to accurately sample from a posterior.
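The decomposition claimed in the abstract can be sketched in finite-dimensional notation as follows; the paper's actual statement is measure-theoretic and lives in function space, so this is an illustrative rendering only, with $h_t$ the (generally intractable) $h$-function of Doob's $h$-transform.

```latex
% Schematic, finite-dimensional rendering of the score decomposition
% described in the abstract (the paper's version lives in function space):
\[
  \nabla_{x} \log p_t(x \mid y)
  \;=\;
  \underbrace{\nabla_{x} \log p_t(x)}_{\text{unconditional score}}
  \;+\;
  \underbrace{\nabla_{x} \log h_t(x)}_{\text{guidance term}},
  \qquad
  h_t(x) \;=\; \mathbb{E}\!\left[\, p(y \mid X_0) \;\middle|\; X_t = x \,\right].
\]
```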
Related papers
- Provable Diffusion Posterior Sampling for Bayesian Inversion [13.807494493914335]
This paper proposes a novel diffusion-based posterior sampling method within a plug-and-play framework. To approximate the posterior score, we develop a Monte Carlo estimator in which particles are generated using Langevin dynamics. On the theoretical side, we provide non-asymptotic error bounds, showing that the method converges even for complex multi-modal target posteriors.
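As a rough illustration of the particle step described above, the sketch below draws particles with unadjusted Langevin dynamics from a user-supplied unnormalized target and averages them into a Monte Carlo estimate; all names and hyperparameters are illustrative, not the paper's.

```python
import numpy as np

def langevin_particles(grad_log_target, x_init, n_particles=64,
                       n_steps=200, step=1e-3, seed=0):
    """Generate particles with unadjusted Langevin dynamics:
    x <- x + step * grad_log_target(x) + sqrt(2 * step) * noise.
    Schematic stand-in for the particle-generation step in the summary."""
    rng = np.random.default_rng(seed)
    x = np.tile(np.asarray(x_init, dtype=float), (n_particles, 1))
    for _ in range(n_steps):
        x += step * grad_log_target(x) + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

# Usage: particles from a standard normal target (grad log p(x) = -x);
# a Monte Carlo estimate is then a mean of some integrand over particles.
particles = langevin_particles(lambda x: -x, x_init=np.zeros(2))
print(particles.mean(axis=0))  # close to 0 for this target
```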
arXiv Detail & Related papers (2025-12-08T20:34:05Z)
- Guided Diffusion Sampling on Function Spaces with Applications to PDEs [112.09025802445329]
We propose a general framework for conditional sampling in PDE-based inverse problems. This is accomplished by a function-space diffusion model and plug-and-play guidance for conditioning. Our method achieves an average 32% accuracy improvement over state-of-the-art fixed-resolution diffusion baselines.
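For intuition, a plug-and-play guidance step for a linear observation model $y = Ax + \varepsilon$ can be sketched as adding the Gaussian data-fidelity gradient to a pretrained unconditional score; `score_fn`, `A`, and `sigma_y` are hypothetical placeholders, and the paper's method operates on function-space discretizations.

```python
import numpy as np

def guided_score(score_fn, A, y, sigma_y, x_t, t):
    """Plug-and-play guidance sketch for y = A x + N(0, sigma_y^2 I):
    conditional score ~= unconditional score + gradient of the
    Gaussian log-likelihood, evaluated at the current iterate."""
    s = score_fn(x_t, t)                    # pretrained unconditional score
    residual = y - A @ x_t                  # data misfit
    guidance = A.T @ residual / sigma_y**2  # grad_x log N(y; A x, sigma_y^2 I)
    return s + guidance
```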
arXiv Detail & Related papers (2025-05-22T17:58:12Z)
- A Mixture-Based Framework for Guiding Diffusion Models [19.83064246586143]
Denoising diffusion models have driven significant progress in the field of Bayesian inverse problems. Recent approaches use pre-trained diffusion models as priors to solve a wide range of such problems. This work proposes a novel mixture approximation of the intermediate posterior distributions that arise during sampling.
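The identity underlying any such mixture approximation is that the score of a mixture is a responsibility-weighted average of the component scores; the toy below shows this for diagonal Gaussians and is not the paper's specific construction.

```python
import numpy as np

def gaussian_mixture_score(x, means, vars_, weights):
    """Score (gradient of log-density) of a diagonal Gaussian mixture:
    grad log p(x) = sum_k responsibility_k(x) * grad log N_k(x).
    means and vars_ are lists of length-d arrays; weights sum to 1."""
    log_comp = np.array([
        -0.5 * np.sum((x - m) ** 2 / v + np.log(2 * np.pi * v))
        for m, v in zip(means, vars_)
    ]) + np.log(weights)
    resp = np.exp(log_comp - log_comp.max())
    resp /= resp.sum()                                # softmax responsibilities
    comp_scores = np.array([-(x - m) / v for m, v in zip(means, vars_)])
    return resp @ comp_scores                         # (d,)-shaped mixture score
```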
arXiv Detail & Related papers (2025-02-05T16:26:06Z)
- Bayesian Circular Regression with von Mises Quasi-Processes [57.88921637944379]
In this work we explore a family of expressive and interpretable distributions over circle-valued random functions. For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Gibbs sampling. We present experiments applying this model to the prediction of wind directions and the percentage of the running gait cycle as a function of joint angles.
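For reference, the von Mises density is the circular building block behind such models; the snippet only evaluates it with SciPy and does not touch the paper's augmentation or Gibbs sampler.

```python
import numpy as np
from scipy.stats import vonmises

# von Mises density on the circle: concentration kappa, mean direction loc.
theta = np.linspace(-np.pi, np.pi, 5)
print(vonmises.pdf(theta, kappa=2.0, loc=0.0))
```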
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Amortizing intractable inference in diffusion models for vision, language, and control [89.65631572949702]
This paper studies amortized sampling of the posterior over data, $\mathbf{x} \sim p^{\mathrm{post}}(\mathbf{x}) \propto p(\mathbf{x})\,r(\mathbf{x})$, in a model that consists of a diffusion generative model prior $p(\mathbf{x})$ and a black-box constraint or function $r(\mathbf{x})$. We prove the correctness of a data-free learning objective, relative trajectory balance, for training a diffusion model that samples from this posterior.
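Relative trajectory balance can be transcribed as a per-trajectory squared log-ratio loss with a learned normalizer; this is a hedged sketch of that idea, and the exact parameterization in the paper may differ.

```python
import torch

def relative_trajectory_balance_loss(log_Z, log_q_traj, log_p_traj, log_r_x0):
    """Schematic RTB loss for one trajectory: drive the fine-tuned model q
    (with learned scalar normalizer Z) toward the prior-times-reward target,
    i.e. Z * q(tau) ~= p(tau) * r(x0)."""
    return (log_Z + log_q_traj - log_p_traj - log_r_x0) ** 2

# At the optimum the loss vanishes on every trajectory, which forces q's
# marginal over x0 to be proportional to p(x0) * r(x0).
```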
arXiv Detail & Related papers (2024-05-31T16:18:46Z)
- An Unconditional Representation of the Conditional Score in Infinite-Dimensional Linear Inverse Problems [5.340736751238338]
We propose an unconditional representation of the conditional score function tailored to linear inverse problems. We show that the conditional score can be derived exactly from a trained (unconditional) score using affine transformations. Our approach is formulated in infinite-dimensional function spaces, making it inherently discretization-invariant.
arXiv Detail & Related papers (2024-05-24T15:33:27Z)
- Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
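Schematically, the covariance enters through a Gaussian approximation of the intractable likelihood $p(y \mid x_t)$ for a linear observation $y = Ax_0 + \varepsilon$; the notation below is illustrative, with $\hat{x}_0$ the posterior mean of $x_0$ given $x_t$ (available via Tweedie's formula) and $\Sigma_{0\mid t}$ the covariance that the paper determines by maximum likelihood rather than a heuristic.

```latex
% Schematic Gaussian approximation of the intractable likelihood used to
% guide diffusion sampling in linear inverse problems (illustrative notation):
\[
  p(y \mid x_t) \;\approx\;
  \mathcal{N}\!\big(y;\; A\,\hat{x}_0(x_t),\; \sigma_y^2 I + A\,\Sigma_{0\mid t}\,A^{\top}\big),
  \qquad
  \hat{x}_0(x_t) = \mathbb{E}[x_0 \mid x_t].
\]
```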
arXiv Detail & Related papers (2024-02-03T13:35:39Z)
- Conditional score-based diffusion models for Bayesian inference in infinite dimensions [4.747324197963405]
We propose a theoretically grounded method for sampling from the posterior of infinite-dimensional inverse problems based on amortized conditional SDMs.
A significant part of our analysis is dedicated to demonstrating that extending infinite-dimensional SDMs to the conditional setting requires careful consideration.
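Amortization here means the observation is an input to the score network, so a single trained model serves all observations; a minimal finite-dimensional PyTorch stand-in (layer sizes and names are illustrative):

```python
import torch
import torch.nn as nn

class ConditionalScoreNet(nn.Module):
    """Schematic amortized conditional score s_theta(x_t, y, t):
    the conditioning data y is simply concatenated to the network input."""
    def __init__(self, dim_x, dim_y, width=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim_x + dim_y + 1, width), nn.SiLU(),
            nn.Linear(width, width), nn.SiLU(),
            nn.Linear(width, dim_x),
        )

    def forward(self, x_t, y, t):
        # t is a (batch, 1) tensor of diffusion times
        return self.net(torch.cat([x_t, y, t], dim=-1))
```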
arXiv Detail & Related papers (2023-05-28T15:34:15Z)
- Score-based Diffusion Models in Function Space [137.70916238028306]
Diffusion models have recently emerged as a powerful framework for generative modeling. This work introduces a mathematically rigorous framework called Denoising Diffusion Operators (DDOs) for training diffusion models in function space. We show that the corresponding discretized algorithm generates accurate samples at a fixed cost independent of the data resolution.
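On any fixed discretization, training such a model reduces to the familiar denoising score matching objective; the sketch below is that finite-dimensional reduction, not the function-space formulation itself, and the scalings `alpha_t`, `sigma_t` are illustrative.

```python
import torch

def denoising_score_matching_loss(score_net, x0, t, alpha_t, sigma_t):
    """Standard DSM objective on a discretized function x0: perturb with the
    forward process x_t = alpha_t * x0 + sigma_t * eps and regress the score
    network onto the known conditional score -eps / sigma_t."""
    eps = torch.randn_like(x0)
    x_t = alpha_t * x0 + sigma_t * eps
    target = -eps / sigma_t
    return ((score_net(x_t, t) - target) ** 2).mean()
```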
arXiv Detail & Related papers (2023-02-14T23:50:53Z)
- Score-based Continuous-time Discrete Diffusion Models [102.65769839899315]
We extend diffusion models to discrete variables by introducing a Markov jump process where the reverse process denoises via a continuous-time Markov chain.
We show that an unbiased estimator can be obtained by simply matching the conditional marginal distributions.
We demonstrate the effectiveness of the proposed method on a set of synthetic and real-world music and image benchmarks.
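The reverse process here is a continuous-time Markov chain, whose transition kernel over an interval is the matrix exponential of its generator; a toy three-state example with a made-up rate matrix:

```python
import numpy as np
from scipy.linalg import expm

# Toy generator (rate) matrix for a 3-state continuous-time Markov chain:
# off-diagonal entries are jump rates, and each row sums to zero.
Q = np.array([[-1.0, 0.5, 0.5],
              [0.3, -0.6, 0.3],
              [0.2, 0.2, -0.4]])

P = expm(0.5 * Q)        # transition probabilities over time t = 0.5
print(P.sum(axis=1))     # each row sums to 1: a valid stochastic matrix
```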
arXiv Detail & Related papers (2022-11-30T05:33:29Z)
- Understanding Variational Inference in Function-Space [20.940162027560408]
We highlight some advantages and limitations of employing the Kullback-Leibler divergence in this setting.
We propose (featurized) Bayesian linear regression as a benchmark for 'function-space' inference methods that directly measures approximation quality.
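The point of that benchmark is that the exact posterior is available in closed form, so approximation quality can be measured directly; a standard implementation under the usual conjugate Gaussian assumptions:

```python
import numpy as np

def blr_posterior(Phi, y, alpha, beta):
    """Exact posterior N(mean, cov) over weights for featurized Bayesian
    linear regression: w ~ N(0, alpha^{-1} I), y | w ~ N(Phi w, beta^{-1} I),
    where Phi is the (n, d) feature matrix."""
    A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi  # posterior precision
    cov = np.linalg.inv(A)
    mean = beta * cov @ Phi.T @ y
    return mean, cov
```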
arXiv Detail & Related papers (2020-11-18T17:42:01Z)