Diffusion-HMC: Parameter Inference with Diffusion Model driven Hamiltonian Monte Carlo
- URL: http://arxiv.org/abs/2405.05255v1
- Date: Wed, 8 May 2024 17:59:03 GMT
- Title: Diffusion-HMC: Parameter Inference with Diffusion Model driven Hamiltonian Monte Carlo
- Authors: Nayantara Mudur, Carolina Cuesta-Lazaro, Douglas P. Finkbeiner
- Abstract summary: This work uses a single diffusion generative model to address the interlinked objectives of generating predictions for observed astrophysical fields from theory and constraining physical models from observations using these predictions.
We leverage the approximate likelihood of the diffusion generative model to derive tight constraints on cosmology by using the Hamiltonian Monte Carlo method to sample the posterior on cosmological parameters for a given test image.
- Score: 2.048226951354646
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion generative models have excelled at diverse image generation and reconstruction tasks across fields. A less explored avenue is their application to discriminative tasks involving regression or classification problems. The cornerstone of modern cosmology is the ability to generate predictions for observed astrophysical fields from theory and constrain physical models from observations using these predictions. This work uses a single diffusion generative model to address these interlinked objectives -- as a surrogate model or emulator for cold dark matter density fields conditional on input cosmological parameters, and as a parameter inference model that solves the inverse problem of constraining the cosmological parameters of an input field. The model is able to emulate fields with summary statistics consistent with those of the simulated target distribution. We then leverage the approximate likelihood of the diffusion generative model to derive tight constraints on cosmology by using the Hamiltonian Monte Carlo method to sample the posterior on cosmological parameters for a given test image. Finally, we demonstrate that this parameter inference approach is more robust to the addition of noise than baseline parameter inference networks.
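The inference procedure described in the abstract — sampling the posterior over cosmological parameters with Hamiltonian Monte Carlo, using the diffusion model's approximate likelihood — can be sketched with a standard leapfrog HMC sampler. In the toy below, a Gaussian log-density stands in for the diffusion model's approximate log-likelihood of a test image; the parameter values, step sizes, and target means are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_prob(theta):
    """Stand-in for the diffusion model's approximate log-likelihood of a
    test image given theta = (Omega_m, sigma_8) -- a toy Gaussian."""
    mu = np.array([0.3, 0.8])
    prec = np.diag([400.0, 900.0])   # tight toy posterior, illustrative only
    d = theta - mu
    return -0.5 * d @ prec @ d

def grad_log_prob(theta, eps=1e-5):
    """Central-difference gradient; in practice this role would be played by
    automatic differentiation through the diffusion model."""
    g = np.zeros_like(theta)
    for i in range(theta.size):
        e = np.zeros_like(theta)
        e[i] = eps
        g[i] = (log_prob(theta + e) - log_prob(theta - e)) / (2 * eps)
    return g

def hmc_step(theta, step=0.02, n_leap=20):
    """One HMC transition: draw momentum, integrate leapfrog, accept/reject."""
    p0 = rng.standard_normal(theta.size)
    th, p = theta.copy(), p0.copy()
    p = p + 0.5 * step * grad_log_prob(th)       # initial half step
    for _ in range(n_leap - 1):
        th = th + step * p
        p = p + step * grad_log_prob(th)
    th = th + step * p
    p = p + 0.5 * step * grad_log_prob(th)       # final half step
    h0 = -log_prob(theta) + 0.5 * p0 @ p0        # Hamiltonian before
    h1 = -log_prob(th) + 0.5 * p @ p             # Hamiltonian after
    return th if np.log(rng.uniform()) < h0 - h1 else theta

theta = np.array([0.5, 0.5])
samples = []
for i in range(2000):
    theta = hmc_step(theta)
    if i >= 500:                                  # discard burn-in
        samples.append(theta)
samples = np.asarray(samples)
print(samples.mean(axis=0))
```

The Metropolis correction at the end makes the chain exact even though the leapfrog integrator only approximately conserves the Hamiltonian.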
Related papers
- Latent diffusion models for parameterization and data assimilation of facies-based geomodels [0.0]
Diffusion models are trained to generate new geological realizations from input fields characterized by random noise.
Latent diffusion models are shown to provide realizations that are visually consistent with samples from geomodeling software.
arXiv Detail & Related papers (2024-06-21T01:32:03Z)
- Conditional score-based diffusion models for solving inverse problems in mechanics [6.319616423658121]
We propose a framework to perform Bayesian inference using conditional score-based diffusion models.
Conditional score-based diffusion models are generative models that learn to approximate the score function of a conditional distribution.
We demonstrate the efficacy of the proposed approach on a suite of high-dimensional inverse problems in mechanics.
arXiv Detail & Related papers (2024-06-19T02:09:15Z)
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) approximates the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
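The tall-data setting combines per-observation posterior information into one joint posterior. A standard identity used in this literature factorises the joint posterior score as the sum of single-observation posterior scores minus the excess prior score. The conjugate-Gaussian toy check below verifies that identity numerically; all distributions here are illustrative choices, not the paper's simulators.

```python
import numpy as np

# Toy check of the tall-data factorisation
#   grad log p(theta | x_1..n) = sum_i grad log p(theta | x_i)
#                                - (n - 1) * grad log p(theta)
# Prior: theta ~ N(0, 1); likelihood: x_i | theta ~ N(theta, s2).
s2 = 0.5
xs = np.array([1.0, 1.4, 0.8])
n = xs.size

def prior_score(theta):
    return -theta                          # grad log N(0, 1)

def single_post_score(theta, x):
    # Conjugate posterior for one observation: N(x/(1+s2), s2/(1+s2))
    var = s2 / (1 + s2)
    mean = x / (1 + s2)
    return -(theta - mean) / var

def tall_score(theta):
    # Combine per-observation scores, subtracting the over-counted prior
    return sum(single_post_score(theta, x) for x in xs) - (n - 1) * prior_score(theta)

def joint_post_score(theta):
    # Exact posterior given all n observations: N(sum(x)/(n+s2), s2/(n+s2))
    var = s2 / (n + s2)
    mean = xs.sum() / (n + s2)
    return -(theta - mean) / var

theta = 0.7
print(tall_score(theta), joint_post_score(theta))  # the two values agree
```

In a diffusion-based sampler, each `single_post_score` would come from a learned conditional score network rather than a closed form.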
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z)
- Cosmological Field Emulation and Parameter Inference with Diffusion Models [2.3020018305241337]
We leverage diffusion generative models to address two tasks of importance to cosmology.
We show that the model is able to generate fields with power spectra consistent with those of the simulated target distribution.
We additionally explore their utility as parameter inference models and find that we can obtain tight constraints on cosmological parameters.
arXiv Detail & Related papers (2023-12-12T18:58:42Z)
- Metropolis Sampling for Constrained Diffusion Models [11.488860260925504]
Denoising diffusion models have recently emerged as the predominant paradigm for generative modelling on image domains.
We introduce an alternative, simple discretisation scheme based on the reflected Brownian motion.
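The summary does not spell out the scheme, but a minimal sketch of a reflected-Brownian-motion discretisation — assuming, for illustration, that the constraint set is the unit interval and that overshoots are folded back by mirror reflection at the walls — looks like:

```python
import numpy as np

rng = np.random.default_rng(1)

def reflect(x, lo=0.0, hi=1.0):
    """Fold points back into [lo, hi] by mirror reflection at the walls."""
    width = hi - lo
    y = np.mod(x - lo, 2 * width)
    y = np.where(y > width, 2 * width - y, y)
    return y + lo

def reflected_bm(x0, n_steps=2000, dt=5e-3):
    """Euler-Maruyama for Brownian motion reflected at the boundary of [0, 1]."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        x = reflect(x + np.sqrt(dt) * rng.standard_normal(x.shape))
    return x

# 5000 independent paths started at the midpoint; the stationary law of
# reflected Brownian motion on [0, 1] is uniform, so long-run means are ~0.5.
paths = reflected_bm(np.full(5000, 0.5))
print(paths.min() >= 0.0, paths.max() <= 1.0)
```

In a constrained diffusion model, this reflection step would be applied inside the reverse-time sampler so that generated samples never leave the feasible set.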
arXiv Detail & Related papers (2023-07-11T17:05:23Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimension modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Flow-based sampling in the lattice Schwinger model at criticality [54.48885403692739]
Flow-based algorithms may provide efficient sampling of field distributions for lattice field theory applications.
We provide a numerical demonstration of robust flow-based sampling in the Schwinger model at the critical value of the fermion mass.
arXiv Detail & Related papers (2022-02-23T19:00:00Z)
- Learning to discover: expressive Gaussian mixture models for multi-dimensional simulation and parameter inference in the physical sciences [0.0]
We show that density models describing multiple observables may be created using an auto-regressive Gaussian mixture model.
The model is designed to capture how observable spectra are deformed by hypothesis variations.
It may be used as a statistical model for scientific discovery in interpreting experimental observations.
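The basic building block of such a density model is a Gaussian mixture fitted by expectation-maximisation; the paper's auto-regressive, multi-dimensional construction is not reproduced here, and the 1D two-component sketch below (with synthetic data standing in for an observable spectrum) is only a simplified illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 1D "observable": a two-component mixture (illustrative data).
x = np.concatenate([rng.normal(-2.0, 0.5, 2000), rng.normal(1.0, 0.8, 3000)])

# EM for a K-component 1D Gaussian mixture density model.
K = 2
w = np.full(K, 1.0 / K)                 # mixture weights
mu = np.array([-1.0, 0.5])              # initial means
var = np.ones(K)                        # initial variances

for _ in range(200):
    # E-step: responsibilities r[n, k] proportional to w_k N(x_n | mu_k, var_k)
    d = x[:, None] - mu[None, :]
    logp = -0.5 * d**2 / var - 0.5 * np.log(2 * np.pi * var) + np.log(w)
    r = np.exp(logp - logp.max(axis=1, keepdims=True))
    r /= r.sum(axis=1, keepdims=True)
    # M-step: re-estimate weights, means, and variances from responsibilities
    nk = r.sum(axis=0)
    w = nk / x.size
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu)**2).sum(axis=0) / nk

order = np.argsort(mu)
print(mu[order], np.sqrt(var[order]))   # recovers the two components
```

Hypothesis variations would enter by making the mixture parameters functions of the physics parameters, so the fitted density deforms smoothly with the hypothesis.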
arXiv Detail & Related papers (2021-08-25T21:27:29Z)
- Leveraging Global Parameters for Flow-based Neural Posterior Estimation [90.21090932619695]
Inferring the parameters of a model based on experimental observations is central to the scientific method.
A particularly challenging setting is when the model is strongly indeterminate, i.e., when distinct sets of parameters yield identical observations.
We present a method for cracking such indeterminacy by exploiting additional information conveyed by an auxiliary set of observations sharing global parameters.
arXiv Detail & Related papers (2021-02-12T12:23:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.