Metropolis Sampling for Constrained Diffusion Models
- URL: http://arxiv.org/abs/2307.05439v2
- Date: Thu, 9 Nov 2023 16:58:21 GMT
- Title: Metropolis Sampling for Constrained Diffusion Models
- Authors: Nic Fishman, Leo Klarner, Emile Mathieu, Michael Hutchinson, Valentin de Bortoli
- Abstract summary: Denoising diffusion models have recently emerged as the predominant paradigm for generative modelling on image domains.
We introduce an alternative, simple discretisation scheme based on the reflected Brownian motion.
- Score: 11.488860260925504
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Denoising diffusion models have recently emerged as the predominant paradigm
for generative modelling on image domains. In addition, their extension to
Riemannian manifolds has facilitated a range of applications across the natural
sciences. While many of these problems stand to benefit from the ability to
specify arbitrary, domain-informed constraints, this setting is not covered by
the existing (Riemannian) diffusion model methodology. Recent work has
attempted to address this issue by constructing novel noising processes based
on the reflected Brownian motion and logarithmic barrier methods. However, the
associated samplers are either computationally burdensome or only apply to
convex subsets of Euclidean space. In this paper, we introduce an alternative,
simple noising scheme based on Metropolis sampling that affords substantial
gains in computational efficiency and empirical performance compared to the
earlier samplers. Of independent interest, we prove that this new process
corresponds to a valid discretisation of the reflected Brownian motion. We
demonstrate the scalability and flexibility of our approach on a range of
problem settings with convex and non-convex constraints, including applications
from geospatial modelling, robotics and protein design.
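The core idea of the abstract — a Metropolis-style discretisation of reflected Brownian motion — can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: a Gaussian proposal is accepted if it stays inside the constraint set and rejected (the chain stays put) otherwise, which for a uniform target on the domain is exactly the Metropolis acceptance rule. The `in_domain` predicate and unit-ball example are assumptions for illustration only.

```python
import numpy as np

def metropolis_reflected_step(x, step_size, in_domain, rng):
    """One Metropolis step approximating reflected Brownian motion.

    Proposes x' = x + sqrt(step_size) * z with z ~ N(0, I). For a uniform
    target on the constraint set, the acceptance probability is 1 inside
    the set and 0 outside, so a rejected proposal leaves x unchanged.
    """
    proposal = x + np.sqrt(step_size) * rng.standard_normal(x.shape)
    return proposal if in_domain(proposal) else x

# Usage: simulate constrained noising on the unit ball in R^2.
rng = np.random.default_rng(0)
in_ball = lambda z: np.linalg.norm(z) <= 1.0
x = np.zeros(2)
for _ in range(1000):
    x = metropolis_reflected_step(x, step_size=0.01, in_domain=in_ball, rng=rng)
assert in_ball(x)  # the chain never leaves the constraint set
```

By construction the chain never exits the domain, and (as the paper proves for its scheme) rejection at the boundary recovers the reflecting behaviour of the limiting Brownian motion as the step size shrinks.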
Related papers
- A Diffusion Model Framework for Unsupervised Neural Combinatorial Optimization [7.378582040635655]
Current deep learning approaches rely on generative models that yield exact sample likelihoods.
This work introduces a method that lifts this restriction and opens the possibility to employ highly expressive latent variable models.
We experimentally validate our approach in data-free Combinatorial Optimization and demonstrate that our method achieves a new state-of-the-art on a wide range of benchmark problems.
arXiv Detail & Related papers (2024-06-03T17:55:02Z) - Reflected Schrödinger Bridge for Constrained Generative Modeling [16.72888494254555]
Reflected diffusion models have become the go-to method for large-scale generative models in real-world applications.
We introduce the Reflected Schrödinger Bridge algorithm: an entropy-regularized optimal transport approach tailored to generating data within diverse bounded domains.
Our algorithm yields robust generative modeling in diverse domains, and its scalability is demonstrated in real-world constrained generative modeling through standard image benchmarks.
arXiv Detail & Related papers (2024-01-06T14:39:58Z) - Multi-Response Heteroscedastic Gaussian Process Models and Their
Inference [1.52292571922932]
We propose a novel framework for the modeling of heteroscedastic covariance functions.
We employ variational inference to approximate the posterior and facilitate posterior predictive modeling.
We show that our proposed framework offers a robust and versatile tool for a wide array of applications.
arXiv Detail & Related papers (2023-08-29T15:06:47Z) - Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z) - A Variational Perspective on Solving Inverse Problems with Diffusion Models [101.831766524264]
Inverse tasks can be formulated as inferring a posterior distribution over data.
This is however challenging in diffusion models since the nonlinear and iterative nature of the diffusion process renders the posterior intractable.
We propose a variational approach that by design seeks to approximate the true posterior distribution.
arXiv Detail & Related papers (2023-05-07T23:00:47Z) - Diffusion Models for Constrained Domains [11.488860260925504]
We present two distinct noising processes based on (i) the logarithmic barrier metric and (ii) the reflected Brownian motion induced by the constraints.
We then demonstrate the practical utility of our methods on a number of synthetic and real-world tasks, including applications from robotics and protein design.
arXiv Detail & Related papers (2023-04-11T17:19:45Z) - Flow-based sampling in the lattice Schwinger model at criticality [54.48885403692739]
Flow-based algorithms may provide efficient sampling of field distributions for lattice field theory applications.
We provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.
arXiv Detail & Related papers (2022-02-23T19:00:00Z) - Riemannian Score-Based Generative Modeling [56.20669989459281]
Score-based generative models (SGMs) have demonstrated remarkable empirical performance.
Current SGMs make the underlying assumption that the data is supported on a Euclidean manifold with flat geometry.
This prevents the use of these models for applications in robotics, geoscience or protein modeling.
arXiv Detail & Related papers (2022-02-06T11:57:39Z) - A Twin Neural Model for Uplift [59.38563723706796]
Uplift is a particular case of conditional treatment effect modeling.
We propose a new loss function defined by leveraging a connection with the Bayesian interpretation of the relative risk.
We show our proposed method is competitive with the state-of-the-art in simulation setting and on real data from large scale randomized experiments.
arXiv Detail & Related papers (2021-05-11T16:02:39Z) - Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences.