Multi-fidelity climate model parameterization for better generalization
and extrapolation
- URL: http://arxiv.org/abs/2309.10231v1
- Date: Tue, 19 Sep 2023 01:03:39 GMT
- Title: Multi-fidelity climate model parameterization for better generalization
and extrapolation
- Authors: Mohamed Aziz Bhouri, Liran Peng, Michael S. Pritchard, Pierre Gentine
- Abstract summary: We show that a multi-fidelity approach, which integrates datasets of different accuracy and abundance, can provide the best of both worlds.
In an application to climate modeling, the multi-fidelity framework yields more accurate climate projections without requiring a major increase in computational resources.
- Score: 0.3860305383611933
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine-learning-based parameterizations (i.e. representation of sub-grid
processes) of global climate models or turbulent simulations have recently been
proposed as a powerful alternative to physical, but empirical, representations,
offering a lower computational cost and higher accuracy. Yet, those approaches
still suffer from a lack of generalization and extrapolation beyond the
training data, which is however critical to projecting climate change or
unobserved regimes of turbulence. Here we show that a multi-fidelity approach,
which integrates datasets of different accuracy and abundance, can provide the
best of both worlds: the capacity to extrapolate leveraging the
physically-based parameterization and a higher accuracy using the
machine-learning-based parameterizations. In an application to climate
modeling, the multi-fidelity framework yields more accurate climate projections
without requiring a major increase in computational resources. Our multi-fidelity
randomized prior networks (MF-RPNs) combine physical parameterization data as
low-fidelity and data from a storm-resolving historical run as high-fidelity. To
extrapolate beyond the training data, the MF-RPNs are tested on high-fidelity
warming-scenario ($+4K$) data. We show the MF-RPN's capacity to return much
more skillful predictions than models trained on only one regime, whether the
low-fidelity or the high-fidelity (historical) data, while providing trustworthy
uncertainty quantification across a wide range of scenarios. Our approach paves
the way for machine-learning-based methods that can optimally leverage
historical observations or high-fidelity simulations and extrapolate to unseen
regimes such as climate change.
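For intuition, the following is a minimal sketch of how a multi-fidelity randomized prior network could be put together, assuming (as in randomized prior networks generally) that each ensemble member is the sum of a trainable network and a frozen, randomly initialized prior network, and that a correction network trained on the scarce high-fidelity data refines a low-fidelity surrogate trained on the abundant physics-based data. The MLP backbones, the additive correction, the two-stage training loop, and all names below are illustrative assumptions, not the authors' exact MF-RPN implementation.

```python
# Illustrative sketch only: a generic multi-fidelity randomized prior network.
# Assumptions (not taken from the paper): MLP backbones, an additive
# high-fidelity correction on top of the low-fidelity surrogate, and a simple
# two-stage training loop. All names (RPNMember, MFRPNMember, ...) are
# hypothetical.
import torch
import torch.nn as nn


def mlp(in_dim, out_dim, width=128):
    return nn.Sequential(
        nn.Linear(in_dim, width), nn.ReLU(),
        nn.Linear(width, width), nn.ReLU(),
        nn.Linear(width, out_dim),
    )


class RPNMember(nn.Module):
    """One randomized-prior member: trainable net + frozen random prior."""

    def __init__(self, in_dim, out_dim, prior_scale=1.0):
        super().__init__()
        self.trainable = mlp(in_dim, out_dim)
        self.prior = mlp(in_dim, out_dim)
        for p in self.prior.parameters():
            p.requires_grad_(False)  # the prior network is never updated
        self.prior_scale = prior_scale

    def forward(self, x):
        return self.trainable(x) + self.prior_scale * self.prior(x)


class MFRPNMember(nn.Module):
    """Low-fidelity surrogate plus a high-fidelity correction network."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lf = RPNMember(in_dim, out_dim)
        # The correction network sees the inputs and the LF prediction.
        self.hf_corr = RPNMember(in_dim + out_dim, out_dim)

    def forward(self, x):
        y_lf = self.lf(x)
        return y_lf + self.hf_corr(torch.cat([x, y_lf], dim=-1))


def train_member(member, x_lf, y_lf, x_hf, y_hf, epochs=2000, lr=1e-3):
    """Stage 1: fit the LF surrogate on abundant physics-based data.
    Stage 2: fit the HF correction on scarce storm-resolving data."""
    loss_fn = nn.MSELoss()
    stages = [(member.lf, member.lf, x_lf, y_lf),    # train the LF part alone
              (member, member.hf_corr, x_hf, y_hf)]  # then only the correction
    for model, trained_part, x, y in stages:
        params = [p for p in trained_part.parameters() if p.requires_grad]
        opt = torch.optim.Adam(params, lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
```

An ensemble of several independently initialized members trained this way yields a mean prediction, and the member-to-member spread can serve as the uncertainty estimate.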
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z)
- Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Parameterizations are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from computationally expensive short, high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
arXiv Detail & Related papers (2024-06-06T10:02:49Z)
- Diffusion posterior sampling for simulation-based inference in tall data settings [53.17563688225137]
Simulation-based inference (SBI) is capable of approximating the posterior distribution that relates input parameters to a given observation.
In this work, we consider a tall data extension in which multiple observations are available to better infer the parameters of the model.
We compare our method to recently proposed competing approaches on various numerical experiments and demonstrate its superiority in terms of numerical stability and computational cost.
arXiv Detail & Related papers (2024-04-11T09:23:36Z)
- Multifidelity linear regression for scientific machine learning from scarce data [0.0]
We propose a new multifidelity training approach for scientific machine learning via linear regression.
We provide bias and variance analysis of our new estimators that guarantee the approach's accuracy and improved robustness to scarce high-fidelity data.
arXiv Detail & Related papers (2024-03-13T15:40:17Z)
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) to tackle this data heterogeneity issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Personalized Federated Learning under Mixture of Distributions [98.25444470990107]
We propose a novel approach to Personalized Federated Learning (PFL), which utilizes Gaussian mixture models (GMM) to fit the input data distributions across diverse clients.
FedGMM possesses an additional advantage of adapting to new clients with minimal overhead, and it also enables uncertainty quantification.
Empirical evaluations on synthetic and benchmark datasets demonstrate the superior performance of our method in both PFL classification and novel sample detection.
arXiv Detail & Related papers (2023-05-01T20:04:46Z)
- Multi-fidelity prediction of fluid flow and temperature field based on transfer learning using Fourier Neural Operator [10.104417481736833]
This work proposes a novel multi-fidelity learning method based on the Fourier Neural Operator.
It uses abundant low-fidelity data and limited high-fidelity data under a transfer learning paradigm.
Three typical fluid and temperature prediction problems are chosen to validate the accuracy of the proposed multi-fidelity model.
arXiv Detail & Related papers (2023-04-14T07:46:03Z)
- ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z)
- Multi-fidelity surrogate modeling for temperature field prediction using deep convolution neural network [8.98674326282801]
This paper proposes a pithy deep multi-fidelity model (DMFM) for temperature field prediction.
It takes advantage of low-fidelity data to boost the performance with less high-fidelity data.
A self-supervised learning method for training the physics-driven deep multi-fidelity model (PD-DMFM) is proposed.
arXiv Detail & Related papers (2023-01-17T03:13:45Z)
- Multi-fidelity Hierarchical Neural Processes [79.0284780825048]
Multi-fidelity surrogate modeling reduces the computational cost by fusing different simulation outputs.
We propose Multi-fidelity Hierarchical Neural Processes (MF-HNP), a unified neural latent variable model for multi-fidelity surrogate modeling.
We evaluate MF-HNP on epidemiology and climate modeling tasks, achieving competitive performance in terms of accuracy and uncertainty estimation.
arXiv Detail & Related papers (2022-06-10T04:54:13Z)
- Climate-Invariant Machine Learning [0.8831201550856289]
Current climate models require representations of processes that occur at scales smaller than model grid size.
Recent machine learning (ML) algorithms hold promise to improve such process representations, but tend to extrapolate poorly to climate regimes they were not trained on.
We propose a new framework - termed "climate-invariant" ML - incorporating knowledge of climate processes into ML algorithms.
arXiv Detail & Related papers (2021-12-14T07:02:57Z)