TAUDiff: Improving statistical downscaling for extreme weather events using generative diffusion models
- URL: http://arxiv.org/abs/2412.13627v1
- Date: Wed, 18 Dec 2024 09:05:19 GMT
- Title: TAUDiff: Improving statistical downscaling for extreme weather events using generative diffusion models
- Authors: Rahul Sundar, Nishant Parashar, Antoine Blanchard, Boyko Dodov,
- Abstract summary: It is crucial to achieve rapid turnaround, dynamical consistency, and accurate spatio-temporal spectral recovery.
We propose an efficient diffusion model, TAUDiff, that combines a deterministic spatio-temporal model for mean-field downscaling with a smaller generative diffusion model for recovering the fine-scale stochastic features.
Our approach can not only ensure quicker simulation of extreme events but also reduce overall carbon footprint due to low inference times.
- Abstract: Deterministic regression-based downscaling models for climate variables often suffer from spectral bias, which can be mitigated by generative models like diffusion models. To enable efficient and reliable simulation of extreme weather events, it is crucial to achieve rapid turnaround, dynamical consistency, and accurate spatio-temporal spectral recovery. We propose an efficient correction diffusion model, TAUDiff, that combines a deterministic spatio-temporal model for mean field downscaling with a smaller generative diffusion model for recovering the fine-scale stochastic features. We demonstrate the efficacy of this approach on downscaling atmospheric wind velocity fields obtained from coarse GCM simulations. Our approach can not only ensure quicker simulation of extreme events but also reduce overall carbon footprint due to low inference times.
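The two-stage design described in the abstract — a deterministic model predicting the downscaled mean field, plus a smaller diffusion model recovering fine-scale stochastic features — can be sketched as a minimal toy pipeline. This is an illustrative assumption, not the authors' implementation: the upsampler, the Gaussian score stub, the residual amplitude, and all function names here are hypothetical stand-ins for trained networks.

```python
import numpy as np

def deterministic_mean(coarse, factor=4):
    """Toy mean-field downscaler: block-replicating upsampler standing in
    for the deterministic spatio-temporal model."""
    return np.kron(coarse, np.ones((factor, factor)))

def sample_residual(shape, n_steps=50, rng=None):
    """Toy reverse-diffusion loop generating a fine-scale stochastic residual.
    The 'score' is a stub (score of N(0, I)); a trained denoiser network
    would replace it in a real correction diffusion model."""
    rng = rng or np.random.default_rng(0)
    x = rng.standard_normal(shape)
    for t in range(n_steps, 0, -1):
        score = -x                      # stand-in for a learned score model
        step = 1.0 / n_steps
        noise = rng.standard_normal(shape) if t > 1 else 0.0
        x = x + step * score + np.sqrt(step) * 0.1 * noise
    return 0.05 * x                     # small-amplitude stochastic detail

def two_stage_downscale(coarse):
    """Mean field from the deterministic stage plus a generated residual."""
    mean = deterministic_mean(coarse)
    return mean + sample_residual(mean.shape)

coarse = np.random.default_rng(1).standard_normal((8, 8))
fine = two_stage_downscale(coarse)
print(fine.shape)
```

Because the diffusion stage only has to model the small residual rather than the full field, it can be kept small — which is the source of the low inference cost the abstract emphasizes.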
Related papers
- A Generative Framework for Probabilistic, Spatiotemporally Coherent Downscaling of Climate Simulation [23.504915709396204]
We present a novel generative framework that uses a score-based diffusion model trained on high-resolution reanalysis data to capture the statistical properties of local weather dynamics.
We demonstrate that the model generates spatially and temporally coherent weather dynamics that align with global climate output.
arXiv Detail & Related papers (2024-12-19T19:47:35Z)
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step.
Our framework offers a 1.3× sampling speedup over existing diffusion models.
arXiv Detail & Related papers (2024-10-28T17:25:56Z)
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Dynamical-generative downscaling of climate model ensembles [13.376226374728917]
We propose a novel approach combining dynamical downscaling with generative artificial intelligence to reduce the cost and improve the uncertainty estimates of downscaled climate projections.
In our framework, an RCM dynamically downscales ESM output to an intermediate resolution, followed by a generative diffusion model that further refines the resolution to the target scale.
arXiv Detail & Related papers (2024-10-02T17:31:01Z)
- Conditional diffusion models for downscaling & bias correction of Earth system model precipitation [1.5193424827619018]
We propose a novel machine learning framework for simultaneous bias correction and downscaling.
Our approach ensures statistical fidelity, preserves large-scale spatial patterns and outperforms existing methods.
arXiv Detail & Related papers (2024-04-05T11:01:50Z)
- Uncertainty-aware Surrogate Models for Airfoil Flow Simulations with Denoising Diffusion Probabilistic Models [26.178192913986344]
We make a first attempt to use denoising diffusion probabilistic models (DDPMs) to train an uncertainty-aware surrogate model for turbulence simulations.
Our results show DDPMs can successfully capture the whole distribution of solutions and, as a consequence, accurately estimate the uncertainty of the simulations.
We also evaluate an emerging generative modeling variant, flow matching, in comparison to regular diffusion models.
arXiv Detail & Related papers (2023-12-08T19:04:17Z)
- Residual Corrective Diffusion Modeling for Km-scale Atmospheric Downscaling [58.456404022536425]
The state of the art in physical hazard prediction from weather and climate requires expensive km-scale numerical simulations driven by coarser-resolution global inputs.
Here, a generative diffusion architecture is explored for downscaling such global inputs to km-scale, as a cost-effective machine learning alternative.
The model is trained to predict 2km data from a regional weather model over Taiwan, conditioned on a 25km global reanalysis.
arXiv Detail & Related papers (2023-09-24T19:57:22Z)
- DiffESM: Conditional Emulation of Earth System Models with Diffusion Models [2.1989764549743476]
A key application of Earth System Models (ESMs) is studying extreme weather events, such as heat waves or dry spells.
We show that diffusion models can effectively emulate the trends of ESMs under previously unseen climate scenarios.
arXiv Detail & Related papers (2023-04-23T17:12:33Z)
- Bi-Noising Diffusion: Towards Conditional Diffusion Models with Generative Restoration Priors [64.24948495708337]
We introduce a new method that brings predicted samples to the training data manifold using a pretrained unconditional diffusion model.
We perform comprehensive experiments to demonstrate the effectiveness of our approach on super-resolution, colorization, turbulence removal, and image-deraining tasks.
arXiv Detail & Related papers (2022-12-14T17:26:35Z)
- Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z)
- How Much is Enough? A Study on Diffusion Times in Score-based Generative Models [76.76860707897413]
Current best practice advocates for a large T to ensure that the forward dynamics brings the diffusion sufficiently close to a known and simple noise distribution.
We show how an auxiliary model can be used to bridge the gap between the ideal and the simulated forward dynamics, followed by a standard reverse diffusion process.
arXiv Detail & Related papers (2022-06-10T15:09:46Z)
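The trade-off discussed in the last entry — choosing the number of diffusion steps T large enough that the forward dynamics ends close to a simple noise distribution — can be seen directly in the standard variance-preserving forward process. The linear beta schedule and its endpoints below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Variance-preserving forward diffusion:
#   x_T = sqrt(alpha_bar_T) * x_0 + sqrt(1 - alpha_bar_T) * eps,  eps ~ N(0, I).
# As T grows, alpha_bar_T -> 0, so the marginal of x_T approaches N(0, I)
# regardless of the data distribution -- the reason large T is advocated.
def alpha_bar(T, beta_min=1e-4, beta_max=0.02):
    betas = np.linspace(beta_min, beta_max, T)   # linear schedule (assumption)
    return np.cumprod(1.0 - betas)

rng = np.random.default_rng(0)
x0 = rng.uniform(-2, 2, size=100_000)            # arbitrary non-Gaussian data

for T in (10, 100, 1000):
    ab_T = alpha_bar(T)[-1]
    xT = np.sqrt(ab_T) * x0 + np.sqrt(1 - ab_T) * rng.standard_normal(x0.shape)
    print(T, round(ab_T, 4), round(xT.std(), 3))
```

With a small T the terminal marginal still carries a visible imprint of the data (alpha_bar_T stays far from zero), which is exactly the gap between the ideal and simulated forward dynamics that the paper's auxiliary model is meant to bridge.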
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.