User-defined Event Sampling and Uncertainty Quantification in Diffusion
Models for Physical Dynamical Systems
- URL: http://arxiv.org/abs/2306.07526v1
- Date: Tue, 13 Jun 2023 03:42:03 GMT
- Title: User-defined Event Sampling and Uncertainty Quantification in Diffusion
Models for Physical Dynamical Systems
- Authors: Marc Finzi, Anudhyan Boral, Andrew Gordon Wilson, Fei Sha, Leonardo
Zepeda-Núñez
- Abstract summary: We show that diffusion models can be adapted to make predictions and provide uncertainty quantification for chaotic dynamical systems.
We develop a probabilistic approximation scheme for the conditional score function which converges to the true distribution as the noise level decreases.
We are able to sample conditionally on nonlinear user-defined events at inference time, and match data statistics even when sampling from the tails of the distribution.
- Score: 49.75149094527068
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Diffusion models are a class of probabilistic generative models that have
been widely used as a prior for image processing tasks like text conditional
generation and inpainting. We demonstrate that these models can be adapted to
make predictions and provide uncertainty quantification for chaotic dynamical
systems. In these applications, diffusion models can implicitly represent
knowledge about outliers and extreme events; however, querying that knowledge
through conditional sampling or measuring probabilities is surprisingly
difficult. Existing methods for conditional sampling at inference time seek
mainly to enforce the constraints, which is insufficient to match the
statistics of the distribution or compute the probability of the chosen events.
To achieve these ends, optimally one would use the conditional score function,
but its computation is typically intractable. In this work, we develop a
probabilistic approximation scheme for the conditional score function which
provably converges to the true distribution as the noise level decreases. With
this scheme we are able to sample conditionally on nonlinear user-defined
events at inference time, and match data statistics even when sampling from the
tails of the distribution.
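The guidance-style idea of approximating the conditional score by adding an event-likelihood gradient to the unconditional score can be illustrated on a toy problem. The sketch below is a generic illustration, not the paper's approximation scheme: the 1-D Gaussian prior, the tail event x > a, the smoothed indicator, and the annealing schedule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: data is 1-D standard normal, so the noise-perturbed unconditional
# score is known in closed form; the user-defined event is the tail {x > a}.
a = 1.5

def score_prior(x, sigma):
    # Score of N(0, 1) convolved with N(0, sigma^2) noise: still Gaussian.
    return -x / (1.0 + sigma**2)

def score_event(x, sigma, sharpness=20.0):
    # Gradient of a smoothed log-indicator log sigmoid(s * (x - a)); as the
    # noise level decreases, s grows and the soft event hardens.
    s = sharpness / (1.0 + sigma)
    sig = 0.5 * (1.0 + np.tanh(0.5 * s * (x - a)))  # numerically stable sigmoid
    return s * (1.0 - sig)

def conditional_langevin(n=2000, sigmas=(1.0, 0.5, 0.25, 0.1, 0.05), steps=100):
    # Annealed Langevin dynamics driven by the approximate conditional score
    # score_prior + score_event.
    x = rng.standard_normal(n)
    for sigma in sigmas:
        eps = 0.5 * sigma**2  # step size annealed with the noise level
        for _ in range(steps):
            grad = score_prior(x, sigma) + score_event(x, sigma)
            x = x + eps * grad + np.sqrt(2.0 * eps) * rng.standard_normal(n)
    return x

samples = conditional_langevin()
print(f"fraction satisfying the event x > a: {np.mean(samples > a):.2f}")
```

Under the unconditional prior only about 7% of samples would land in the tail x > 1.5; with the event-score term the sampler concentrates its mass there instead.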
Related papers
- Theory on Score-Mismatched Diffusion Models and Zero-Shot Conditional Samplers [49.97755400231656]
We present the first performance guarantees, with explicit dimensional dependence, for general score-mismatched diffusion samplers.
We show that score mismatches result in a distributional bias between the target and sampling distributions, proportional to the accumulated mismatch between the target and training distributions.
This result can be directly applied to zero-shot conditional samplers for any conditional model, irrespective of measurement noise.
arXiv Detail & Related papers (2024-10-17T16:42:12Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
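The standard split conformal recipe underlying such prediction sets can be sketched as follows. This is the generic construction, not this paper's method (which additionally incorporates an estimate of the conditional distribution); the polynomial regressor, the sinusoidal data-generating process, and the miscoverage level alpha = 0.1 are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Split conformal prediction: fit a model on a training split, compute
# absolute-residual scores on a calibration split, and take a finite-sample-
# corrected quantile as the interval half-width.
x = rng.uniform(-3.0, 3.0, 2000)
y = np.sin(x) + 0.2 * rng.standard_normal(2000)
x_tr, y_tr, x_cal, y_cal = x[:1000], y[:1000], x[1000:], y[1000:]

coef = np.polyfit(x_tr, y_tr, 5)  # crude stand-in regression model

def predict(t):
    return np.polyval(coef, t)

alpha = 0.1                               # target miscoverage
scores = np.abs(y_cal - predict(x_cal))   # conformity scores
m = len(scores)
q = np.quantile(scores, np.ceil((1.0 - alpha) * (m + 1)) / m)

# Intervals predict(x) +/- q enjoy marginal coverage >= 1 - alpha.
x_te = rng.uniform(-3.0, 3.0, 500)
y_te = np.sin(x_te) + 0.2 * rng.standard_normal(500)
coverage = np.mean(np.abs(y_te - predict(x_te)) <= q)
print(f"empirical coverage at 1 - alpha = 0.9: {coverage:.2f}")
```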
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Conditional Pseudo-Reversible Normalizing Flow for Surrogate Modeling in Quantifying Uncertainty Propagation [11.874729463016227]
We introduce a conditional pseudo-reversible normalizing flow for constructing surrogate models of a physical model polluted by additive noise.
The training process utilizes a dataset consisting of input-output pairs, without requiring prior knowledge of the noise or the function.
Our model, once trained, can generate samples from any conditional probability density functions whose high probability regions are covered by the training set.
arXiv Detail & Related papers (2024-03-31T00:09:58Z)
- Probabilistic Forecasting with Stochastic Interpolants and Föllmer Processes [18.344934424278048]
We propose a framework for probabilistic forecasting of dynamical systems based on generative modeling.
We show that the drift and the diffusion coefficients of this SDE can be adjusted after training, and that a specific choice that minimizes the impact of the estimation error gives a Föllmer process.
arXiv Detail & Related papers (2024-03-20T16:33:06Z) - Unveil Conditional Diffusion Models with Classifier-free Guidance: A Sharp Statistical Theory [87.00653989457834]
Conditional diffusion models serve as the foundation of modern image synthesis and find extensive application in fields like computational biology and reinforcement learning.
Despite the empirical success, theory of conditional diffusion models is largely missing.
This paper bridges the gap by presenting a sharp statistical theory of distribution estimation using conditional diffusion models.
arXiv Detail & Related papers (2024-03-18T17:08:24Z) - SMURF-THP: Score Matching-based UnceRtainty quantiFication for
Transformer Hawkes Process [76.98721879039559]
We propose SMURF-THP, a score-based method for learning Transformer Hawkes process and quantifying prediction uncertainty.
Specifically, SMURF-THP learns the score function of events' arrival time based on a score-matching objective.
We conduct extensive experiments in both event type prediction and uncertainty quantification of arrival time.
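As a minimal illustration of score matching in general (not SMURF-THP's Transformer Hawkes objective), a linear score model fitted by denoising score matching on Gaussian data recovers the true score slope of the perturbed distribution; the 1-D setup and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Denoising score matching in one dimension: fit s(x) = theta * x by minimizing
# E[(s(x + sigma * eps) + eps / sigma)^2] over theta, which is quadratic and has
# a closed-form least-squares solution. For x ~ N(0, 1) the optimum is the true
# perturbed-score slope, theta* = -1 / (1 + sigma^2).
n, sigma = 200_000, 0.5
x = rng.standard_normal(n)
eps = rng.standard_normal(n)
x_noisy = x + sigma * eps

theta = -np.mean(x_noisy * eps / sigma) / np.mean(x_noisy**2)
print(f"fitted slope {theta:.3f} vs. true slope {-1.0 / (1.0 + sigma**2):.3f}")
```

With sigma = 0.5 the true slope is -0.8, and the Monte Carlo fit lands within sampling error of it.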
arXiv Detail & Related papers (2023-10-25T03:33:45Z)
- Debias Coarsely, Sample Conditionally: Statistical Downscaling through Optimal Transport and Probabilistic Diffusion Models [15.623456909553786]
We introduce a two-stage probabilistic framework for statistical downscaling using unpaired data.
We demonstrate the utility of the proposed approach on one- and two-dimensional fluid flow problems.
arXiv Detail & Related papers (2023-05-24T23:40:23Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Spectral Representation Learning for Conditional Moment Models [33.34244475589745]
We propose a procedure that automatically learns representations with controlled measures of ill-posedness.
Our method approximates a linear representation defined by the spectral decomposition of a conditional expectation operator.
We show this representation can be efficiently estimated from data, and establish L2 consistency for the resulting estimator.
arXiv Detail & Related papers (2022-10-29T07:48:29Z)
- Adversarial sampling of unknown and high-dimensional conditional distributions [0.0]
In this paper, the sampling method, as well as the inference of the underlying distribution, is handled with a data-driven method known as generative adversarial networks (GANs).
A GAN trains two competing neural networks to produce a generator that can effectively sample from the training-set distribution.
It is shown that all the versions of the proposed algorithm effectively sample the target conditional distribution with minimal impact on the quality of the samples.
arXiv Detail & Related papers (2021-11-08T12:23:38Z)
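The adversarial setup described in that entry can be sketched with a deliberately tiny example: an affine generator and a logistic discriminator with quadratic features, trained by simultaneous gradient steps to match a 1-D Gaussian target. Everything here (the target N(2, 1), the architectures, learning rates, and step counts) is an illustrative assumption, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)

def sigmoid(z):
    return 0.5 * (1.0 + np.tanh(0.5 * z))  # numerically stable sigmoid

# Target ("training set") distribution: N(2, 1).
mu_true = 2.0

# Generator G(z) = w * z + b; discriminator D(x) = sigmoid(a1*x + a2*x^2 + c).
w, b = 1.0, 0.0
a1, a2, c = 0.0, 0.0, 0.0
lr, batch = 0.05, 256

for _ in range(2000):
    z = rng.standard_normal(batch)
    fake = w * z + b
    real = mu_true + rng.standard_normal(batch)

    # Discriminator ascent on log D(real) + log(1 - D(fake)); the gradient of
    # the logistic loss with respect to the logits is (label - D).
    for xs, label in ((real, 1.0), (fake, 0.0)):
        d = sigmoid(a1 * xs + a2 * xs**2 + c)
        a1 += lr * np.mean((label - d) * xs)
        a2 += lr * np.mean((label - d) * xs**2)
        c += lr * np.mean(label - d)

    # Generator ascent on log D(fake) (the non-saturating objective):
    # d/dx log D = (1 - D) * (a1 + 2 * a2 * x), with dx/dw = z and dx/db = 1.
    d_fake = sigmoid(a1 * fake + a2 * fake**2 + c)
    g = (1.0 - d_fake) * (a1 + 2.0 * a2 * fake)
    w += lr * np.mean(g * z)
    b += lr * np.mean(g)

samples = w * rng.standard_normal(5000) + b
print(f"generated mean {samples.mean():.2f}, std {samples.std():.2f}")
```

The quadratic discriminator features (x and x^2) let the game match the target's first two moments; simultaneous gradient play of this kind can oscillate around the equilibrium rather than settle exactly on it, which is the usual caveat with adversarial training.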
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.