Fast, Scale-Adaptive, and Uncertainty-Aware Downscaling of Earth System
Model Fields with Generative Foundation Models
- URL: http://arxiv.org/abs/2403.02774v1
- Date: Tue, 5 Mar 2024 08:41:41 GMT
- Title: Fast, Scale-Adaptive, and Uncertainty-Aware Downscaling of Earth System
Model Fields with Generative Foundation Models
- Authors: Philipp Hess, Michael Aich, Baoxiang Pan, and Niklas Boers
- Abstract summary: We develop a consistency model (CM) that efficiently and accurately downscales arbitrary Earth system model (ESM) simulations without retraining in a zero-shot manner.
We show that the CM outperforms state-of-the-art diffusion models at a fraction of the computational cost while maintaining high controllability over the downscaling task.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurate and high-resolution Earth system model (ESM) simulations are
essential to assess the ecological and socio-economic impacts of anthropogenic
climate change, but are computationally too expensive. Recent machine learning
approaches have shown promising results in downscaling ESM simulations,
outperforming state-of-the-art statistical approaches. However, existing
methods require computationally costly retraining for each ESM and extrapolate
poorly to climates unseen during training. We address these shortcomings by
learning a consistency model (CM) that efficiently and accurately downscales
arbitrary ESM simulations without retraining in a zero-shot manner. Our
foundation model approach yields probabilistic downscaled fields at a resolution limited only by the observational reference data. We show that the CM outperforms state-of-the-art diffusion models at a fraction of the computational cost while maintaining high controllability over the downscaling task. Further,
our method generalizes to climate states unseen during training without
explicitly formulated physical constraints.
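To make the one-step generation concrete, below is a minimal, hypothetical PyTorch sketch of consistency-model downscaling, not the authors' released model: `ConsistencyNet`, the noise level `sigma`, and the Karras-style scalings `c_skip`/`c_out` are illustrative assumptions. It only shows how an upsampled, noise-perturbed ESM field could be mapped to a high-resolution sample in a single consistency step.

```python
# Minimal, self-contained sketch (not the authors' code) of one-step
# consistency-model downscaling: the coarse ESM field is upsampled,
# perturbed with noise, and mapped to a high-resolution sample in a
# single step f(x, sigma) = c_skip * x + c_out * F(x, sigma).
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConsistencyNet(nn.Module):
    """Toy stand-in for a trained consistency model F_theta(x, sigma)."""

    def __init__(self, channels: int = 1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels + 1, 32, 3, padding=1), nn.SiLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor, sigma: torch.Tensor) -> torch.Tensor:
        # Broadcast the noise level as an extra input channel.
        s = sigma.view(-1, 1, 1, 1).expand(-1, 1, *x.shape[-2:])
        return self.body(torch.cat([x, s], dim=1))


def consistency_downscale(net: nn.Module, coarse: torch.Tensor, scale: int = 4,
                          sigma: float = 1.0, sigma_data: float = 0.5) -> torch.Tensor:
    """Upsample a coarse field, perturb it, and apply one consistency step."""
    up = F.interpolate(coarse, scale_factor=scale, mode="bilinear", align_corners=False)
    x = up + sigma * torch.randn_like(up)                    # noise-perturbed first guess
    c_skip = sigma_data**2 / (sigma**2 + sigma_data**2)      # Karras-style scalings
    c_out = sigma * sigma_data / (sigma**2 + sigma_data**2) ** 0.5
    return c_skip * x + c_out * net(x, torch.full((x.shape[0],), sigma))


if __name__ == "__main__":
    coarse_esm = torch.randn(1, 1, 32, 32)                   # one coarse ESM snapshot
    hi_res = consistency_downscale(ConsistencyNet(), coarse_esm)
    print(hi_res.shape)                                      # torch.Size([1, 1, 128, 128])
```

In such a setup, the network would be trained on high-resolution observational data, and the perturbation level `sigma` would act as a knob trading off fidelity to the upsampled ESM field against the detail added by the generative model.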
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
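As a rough, hypothetical sketch of what such autoregressive sampling can look like (the sampler `sample_frame`, the window length, and the toy example are assumptions, not the paper's implementation): each future frame is drawn conditioned on a sliding window of recent frames and then fed back as conditioning for the next step.

```python
# Hypothetical sketch of autoregressive forecasting with a conditional
# generative sampler; `sample_frame` stands in for a trained conditional
# score-based/diffusion sampler and is NOT the paper's implementation.
from typing import Callable

import torch


def autoregressive_rollout(sample_frame: Callable[[torch.Tensor], torch.Tensor],
                           history: torch.Tensor,   # (T, C, H, W) initial frames
                           n_future: int, window: int = 3) -> torch.Tensor:
    """Generate n_future frames, conditioning each step on the last `window` frames."""
    frames = list(history.unbind(0))
    for _ in range(n_future):
        cond = torch.stack(frames[-window:], dim=0)   # sliding conditioning window
        frames.append(sample_frame(cond))             # one conditional sample
    return torch.stack(frames[-n_future:], dim=0)


if __name__ == "__main__":
    # Toy "sampler": persistence plus noise, standing in for a trained model.
    def toy_sampler(cond: torch.Tensor) -> torch.Tensor:
        return cond[-1] + 0.1 * torch.randn_like(cond[-1])

    forecast = autoregressive_rollout(toy_sampler, torch.randn(3, 1, 16, 16), n_future=5)
    print(forecast.shape)   # torch.Size([5, 1, 16, 16])
```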
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Dynamical-generative downscaling of climate model ensembles [13.376226374728917]
We propose a novel approach combining dynamical downscaling with generative artificial intelligence to reduce the cost and improve the uncertainty estimates of downscaled climate projections.
In our framework, an RCM dynamically downscales ESM output to an intermediate resolution, followed by a generative diffusion model that further refines the resolution to the target scale.
arXiv Detail & Related papers (2024-10-02T17:31:01Z)
- Efficient Localized Adaptation of Neural Weather Forecasting: A Case Study in the MENA Region [62.09891513612252]
We focus on limited-area modeling and train our model specifically for localized region-level downstream tasks.
We consider the MENA region due to its unique climatic challenges, where accurate localized weather forecasting is crucial for managing water resources and agriculture and for mitigating the impacts of extreme weather events.
Our study aims to validate the effectiveness of integrating parameter-efficient fine-tuning (PEFT) methodologies, specifically Low-Rank Adaptation (LoRA) and its variants, to enhance forecast accuracy, as well as training speed, computational resource utilization, and memory efficiency in weather and climate modeling for specific regions.
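As a rough illustration of the LoRA idea referenced above (an assumption about the setup, not the study's code), a frozen pretrained linear layer can be augmented with a trainable low-rank update so that only a small fraction of parameters is adapted for the target region:

```python
# Illustrative LoRA adapter (an assumption about how PEFT could be applied
# here, not the study's code): the frozen base weight is augmented with a
# trainable low-rank update scaled by alpha / rank.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, base: nn.Linear, rank: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():              # freeze pretrained weights
            p.requires_grad = False
        self.lora_a = nn.Parameter(torch.randn(rank, base.in_features) * 0.01)
        self.lora_b = nn.Parameter(torch.zeros(base.out_features, rank))
        self.scaling = alpha / rank

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # y = base(x) + scaling * x A^T B^T  (low-rank residual update)
        return self.base(x) + self.scaling * (x @ self.lora_a.T @ self.lora_b.T)


if __name__ == "__main__":
    layer = LoRALinear(nn.Linear(64, 64))
    out = layer(torch.randn(2, 64))
    n_trainable = sum(p.numel() for p in layer.parameters() if p.requires_grad)
    print(out.shape, n_trainable)                     # only the low-rank factors train
```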
arXiv Detail & Related papers (2024-09-11T19:31:56Z)
- Towards Causal Representations of Climate Model Data [18.82507552857727]
This work delves into the potential of causal representation learning, specifically the Causal Discovery with Single-parent Decoding (CDSD) method.
Our findings shed light on the challenges, limitations, and promise of using CDSD as a stepping stone towards more interpretable and robust climate model emulation.
arXiv Detail & Related papers (2023-12-05T16:13:34Z)
- DiffESM: Conditional Emulation of Earth System Models with Diffusion Models [2.1989764549743476]
A key application of Earth System Models (ESMs) is studying extreme weather events, such as heat waves or dry spells.
We show that diffusion models can effectively emulate the trends of ESMs under previously unseen climate scenarios.
arXiv Detail & Related papers (2023-04-23T17:12:33Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
Without mitigating techniques, however, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Physically Constrained Generative Adversarial Networks for Improving Precipitation Fields from Earth System Models [0.0]
Existing post-processing methods can improve ESM simulations locally, but cannot correct errors in modelled spatial patterns.
We propose a framework based on physically constrained generative adversarial networks (GANs) to improve local distributions and spatial structure simultaneously.
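One way such a physical constraint could be realized (a sketch under assumptions; the paper's exact formulation may differ) is a renormalization layer that forces the generated field to conserve the mean of the ESM input while leaving the spatial structure to the generator:

```python
# Illustrative "physical constraint" layer for a GAN post-processor (an
# assumption, not necessarily the paper's exact formulation): rescale the
# generated field so its domain mean matches the ESM input.
import torch
import torch.nn as nn


class ConservationLayer(nn.Module):
    """Rescale generator output to conserve the mean of the input field."""

    def forward(self, generated: torch.Tensor, esm_input: torch.Tensor) -> torch.Tensor:
        target_mean = esm_input.mean(dim=(-2, -1), keepdim=True)
        current_mean = generated.mean(dim=(-2, -1), keepdim=True)
        return generated * target_mean / (current_mean + 1e-8)


if __name__ == "__main__":
    gen_out = torch.rand(1, 1, 64, 64)    # generated precipitation field
    esm = torch.rand(1, 1, 64, 64)        # ESM field on the same grid
    constrained = ConservationLayer()(gen_out, esm)
    print(constrained.mean().item(), esm.mean().item())   # means now (nearly) agree
```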
arXiv Detail & Related papers (2022-08-25T15:19:10Z)
- Sample-Efficient Reinforcement Learning via Conservative Model-Based Actor-Critic [67.00475077281212]
Model-based reinforcement learning algorithms are more sample efficient than their model-free counterparts.
We propose a novel approach that achieves high sample efficiency without the strong reliance on accurate learned models.
We show that CMBAC significantly outperforms state-of-the-art approaches in terms of sample efficiency on several challenging tasks.
arXiv Detail & Related papers (2021-12-16T15:33:11Z)
- Reinforcement Learning for Adaptive Mesh Refinement [63.7867809197671]
We propose a novel formulation of AMR as a Markov decision process and apply deep reinforcement learning to train refinement policies directly from simulation.
The model sizes of these policy architectures are independent of the mesh size and hence scale to arbitrarily large and complex simulations.
arXiv Detail & Related papers (2021-03-01T22:55:48Z)
- Model-based Policy Optimization with Unsupervised Model Adaptation [37.09948645461043]
We investigate how to bridge the gap between real and simulated data due to inaccurate model estimation for better policy optimization.
We propose a novel model-based reinforcement learning framework AMPO, which introduces unsupervised model adaptation.
Our approach achieves state-of-the-art performance in terms of sample efficiency on a range of continuous control benchmark tasks.
arXiv Detail & Related papers (2020-10-19T14:19:42Z)
- No MCMC for me: Amortized sampling for fast and stable training of energy-based models [62.1234885852552]
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.
We present a simple method for training EBMs at scale using an entropy-regularized generator to amortize the MCMC sampling.
Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and more stable training.
arXiv Detail & Related papers (2020-10-08T19:17:20Z)