Stochastic Optimal Control for Diffusion Bridges in Function Spaces
- URL: http://arxiv.org/abs/2405.20630v4
- Date: Wed, 30 Oct 2024 05:21:00 GMT
- Title: Stochastic Optimal Control for Diffusion Bridges in Function Spaces
- Authors: Byoungwoo Park, Jungwon Choi, Sungbin Lim, Juho Lee
- Abstract summary: We present a theory of optimal control tailored to infinite-dimensional spaces.
We show how Doob's $h$-transform can be derived from the SOC perspective and expanded to infinite dimensions.
We propose two applications: learning bridges between two infinite-dimensional distributions and generative models for sampling from an infinite-dimensional distribution.
- Abstract: Recent advancements in diffusion models and diffusion bridges primarily focus on finite-dimensional spaces, yet many real-world problems necessitate operations in infinite-dimensional function spaces for more natural and interpretable formulations. In this paper, we present a theory of stochastic optimal control (SOC) tailored to infinite-dimensional spaces, aiming to extend diffusion-based algorithms to function spaces. Specifically, we demonstrate how Doob's $h$-transform, the fundamental tool for constructing diffusion bridges, can be derived from the SOC perspective and expanded to infinite dimensions. This expansion presents a challenge, as infinite-dimensional spaces typically lack closed-form densities. Leveraging our theory, we establish that solving the optimal control problem with a specific objective function choice is equivalent to learning diffusion-based generative models. We propose two applications: (1) learning bridges between two infinite-dimensional distributions and (2) generative models for sampling from an infinite-dimensional distribution. Our approach proves effective for diverse problems involving continuous function space representations, such as resolution-free images, time-series data, and probability density functions.
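The abstract's central tool, Doob's $h$-transform, is easiest to see in the classical one-dimensional case before the paper's infinite-dimensional extension: conditioning Brownian motion to hit a target $y$ at time $T$ adds the drift $\nabla_x \log h(t,x) = (y - x)/(T - t)$, where $h$ is the Gaussian transition density. The sketch below (not the paper's method, which works in function spaces without closed-form densities) simulates this bridge with Euler-Maruyama; the function name and parameters are illustrative.

```python
import numpy as np

def simulate_brownian_bridge(x0, y, T=1.0, n_steps=1000, seed=0):
    """Euler-Maruyama simulation of a Brownian bridge from x0 to y on [0, T].

    Doob's h-transform of Brownian motion conditioned on X_T = y uses
    h(t, x) = p(T - t, x, y), the Gaussian transition density, and the
    conditioned SDE gains the drift grad_x log h = (y - x) / (T - t).
    """
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    x = np.empty(n_steps + 1)
    x[0] = x0
    for k in range(n_steps):
        t = k * dt
        drift = (y - x[k]) / (T - t)  # Doob h-transform drift
        x[k + 1] = x[k] + drift * dt + np.sqrt(dt) * rng.standard_normal()
    return x

path = simulate_brownian_bridge(x0=0.0, y=2.0)
```

The drift grows as $t \to T$, pinning the path to the target; the paper's SOC view recovers exactly this drift as the optimal control of a terminal-cost problem, which is what generalizes to infinite dimensions where the density $h$ is unavailable.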
Related papers
- G2D2: Gradient-guided Discrete Diffusion for image inverse problem solving [55.185588994883226]
This paper presents a novel method for addressing linear inverse problems by leveraging image-generation models based on discrete diffusion as priors.
To the best of our knowledge, this is the first approach to use discrete diffusion model-based priors for solving image inverse problems.
arXiv Detail & Related papers (2024-10-09T06:18:25Z)
- DiffSG: A Generative Solver for Network Optimization with Diffusion Model [75.27274046562806]
Diffusion generative models can consider a broader range of solutions and exhibit stronger generalization by learning the underlying solution distribution.
We propose a new framework, which leverages intrinsic distribution learning of diffusion generative models to learn high-quality solutions.
arXiv Detail & Related papers (2024-08-13T07:56:21Z)
- Simulating infinite-dimensional nonlinear diffusion bridges [1.747623282473278]
A diffusion bridge is a diffusion process conditioned to hit a specific state within a finite time horizon.
We present a solution by merging score-matching techniques with operator learning, enabling a direct approach to score-matching for the infinite-dimensional bridge.
arXiv Detail & Related papers (2024-05-28T16:52:52Z)
- Functional Diffusion [55.251174506648454]
We propose a new class of generative diffusion models, called functional diffusion.
Functional diffusion can be seen as an extension of classical diffusion models to an infinite-dimensional domain.
We show generative results on complicated signed distance functions and deformation functions defined on 3D surfaces.
arXiv Detail & Related papers (2023-11-26T21:35:34Z)
- Scaling Riemannian Diffusion Models [68.52820280448991]
We show that our method enables us to scale to high-dimensional tasks on nontrivial manifolds.
We model QCD densities on $SU(n)$ lattices and contrastively learned embeddings on high dimensional hyperspheres.
arXiv Detail & Related papers (2023-10-30T21:27:53Z)
- Infinite-Dimensional Diffusion Models [4.342241136871849]
We formulate diffusion-based generative models in infinite dimensions and apply them to the generative modeling of functions.
We show that our formulations are well posed in the infinite-dimensional setting and provide dimension-independent bounds on the distance from the sample distribution to the target measure.
We also develop guidelines for the design of infinite-dimensional diffusion models.
arXiv Detail & Related papers (2023-02-20T18:00:38Z)
- Neural Set Function Extensions: Learning with Discrete Functions in High Dimensions [63.21838830509772]
We develop a framework for extending set functions onto low-dimensional continuous domains.
Our framework subsumes many well-known extensions as special cases.
We convert low-dimensional neural network bottlenecks into representations in high-dimensional spaces.
arXiv Detail & Related papers (2022-08-08T10:58:02Z)
- Lifting the Convex Conjugate in Lagrangian Relaxations: A Tractable Approach for Continuous Markov Random Fields [53.31927549039624]
We show that a piecewise discretization preserves contrast better than existing discretization approaches.
We apply this theory to the problem of matching two images.
arXiv Detail & Related papers (2021-07-13T12:31:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.