Towards Physically Consistent Deep Learning For Climate Model Parameterizations
- URL: http://arxiv.org/abs/2406.03920v1
- Date: Thu, 6 Jun 2024 10:02:49 GMT
- Title: Towards Physically Consistent Deep Learning For Climate Model Parameterizations
- Authors: Birgit Kühbacher, Fernando Iglesias-Suarez, Niki Kilbertus, Veronika Eyring
- Abstract summary: We propose an efficient supervised learning framework for deep learning-based parameterizations.
We show that our method robustly identifies a small subset of the inputs as actual physical drivers.
Our framework represents a crucial step in addressing a major challenge in data-driven climate model parameterizations.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Climate models play a critical role in understanding and projecting climate change. Due to their complexity, their horizontal resolution of ~40-100 km remains too coarse to resolve processes such as clouds and convection, which need to be approximated via parameterizations. These parameterizations are a major source of systematic errors and large uncertainties in climate projections. Deep learning (DL)-based parameterizations, trained on computationally expensive, short high-resolution simulations, have shown great promise for improving climate models in that regard. However, their lack of interpretability and tendency to learn spurious non-physical correlations result in reduced trust in the climate simulation. We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models with improved interpretability and negligible computational overhead compared to standard supervised training. First, key features determining the target physical processes are uncovered. Subsequently, the neural network is fine-tuned using only those relevant features. We show empirically that our method robustly identifies a small subset of the inputs as actual physical drivers, therefore, removing spurious non-physical relationships. This results in by design physically consistent and interpretable neural networks while maintaining the predictive performance of standard black-box DL-based parameterizations. Our framework represents a crucial step in addressing a major challenge in data-driven climate model parameterizations by respecting the underlying physical processes, and may also benefit physically consistent deep learning in other research fields.
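The abstract describes a two-step framework: first uncover the key input features driving the target physical process, then fine-tune the network on only those features. The paper's specific attribution and fine-tuning procedures are not given in this summary, so the following is only a minimal toy sketch of that select-then-refit idea, using permutation importance on a linear least-squares model as a stand-in for the DL parameterization; all data, names, and the choice of selection method here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic setup: 20 candidate inputs, but only 3 are true physical drivers.
n, d = 2000, 20
X = rng.normal(size=(n, d))
y = 1.5 * X[:, 2] - 2.0 * X[:, 5] + 0.8 * X[:, 11] + 0.1 * rng.normal(size=n)

def fit_model(X, y):
    """Least-squares fit: a stand-in for training the parameterization."""
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

def permutation_importance(X, y, w):
    """Increase in MSE when each input column is shuffled.

    Shuffling destroys an input's relationship to the target, so a large
    MSE increase flags that input as a driver of the prediction.
    """
    base_mse = np.mean((X @ w - y) ** 2)
    scores = np.empty(X.shape[1])
    for j in range(X.shape[1]):
        Xp = X.copy()
        Xp[:, j] = rng.permutation(Xp[:, j])
        scores[j] = np.mean((Xp @ w - y) ** 2) - base_mse
    return scores

# Step 1: uncover key features from a model trained on all inputs.
w_full = fit_model(X, y)
scores = permutation_importance(X, y, w_full)
selected = np.argsort(scores)[-3:]

# Step 2: fine-tune (here: simply refit) using only the identified drivers,
# so spurious inputs cannot enter the model by construction.
w_sub = fit_model(X[:, selected], y)

print(sorted(selected.tolist()))  # → [2, 5, 11], the true drivers
```

In this toy case the selection recovers exactly the three constructed drivers, mirroring the paper's claim that restricting the network to identified physical drivers removes spurious relationships without hurting predictive skill.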
Related papers
- SMILE: Zero-Shot Sparse Mixture of Low-Rank Experts Construction From Pre-Trained Foundation Models [85.67096251281191]
We present an innovative approach to model fusion called zero-shot Sparse MIxture of Low-rank Experts (SMILE) construction.
SMILE allows for the upscaling of source models into an MoE model without extra data or further training.
We conduct extensive experiments across diverse scenarios, such as image classification and text generation tasks, using full fine-tuning and LoRA fine-tuning.
arXiv Detail & Related papers (2024-08-19T17:32:15Z)
- Boosting Inference Efficiency: Unleashing the Power of Parameter-Shared Pre-trained Language Models [109.06052781040916]
We introduce a technique to enhance the inference efficiency of parameter-shared language models.
We also propose a simple pre-training technique that leads to fully or partially shared models.
Results demonstrate the effectiveness of our methods on both autoregressive and autoencoding PLMs.
arXiv Detail & Related papers (2023-10-19T15:13:58Z)
- Multi-fidelity climate model parameterization for better generalization and extrapolation [0.3860305383611933]
We show that a multi-fidelity approach, which integrates datasets of different accuracy and abundance, can provide the best of both worlds.
In an application to climate modeling, the multi-fidelity framework yields more accurate climate projections without requiring major increase in computational resources.
arXiv Detail & Related papers (2023-09-19T01:03:39Z)
- ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z)
- Physics-constrained deep learning postprocessing of temperature and humidity [0.0]
We propose to achieve physical consistency in deep learning-based postprocessing models.
We find that constraining a neural network to enforce thermodynamic state equations yields physically-consistent predictions.
arXiv Detail & Related papers (2022-12-07T09:31:25Z)
- Stabilizing Machine Learning Prediction of Dynamics: Noise and Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, this approach can result in artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z)
- Combining data assimilation and machine learning to estimate parameters of a convective-scale model [0.0]
Errors in the representation of clouds in convection-permitting numerical weather prediction models can be introduced by different sources.
In this work, we look at the problem of parameter estimation through an artificial intelligence lens by training two types of artificial neural networks.
arXiv Detail & Related papers (2021-09-07T09:17:29Z)
- Deep learning for improved global precipitation in numerical weather prediction systems [1.721029532201972]
We use a deep convolutional neural network with the UNET architecture and residual learning as a proof of concept to learn global data-driven models of precipitation.
The results are compared with the operational dynamical model used by the India Meteorological Department.
This study is a proof-of-concept showing that residual learning-based UNET can unravel physical relationships to target precipitation.
arXiv Detail & Related papers (2021-06-20T05:10:42Z)
- Hybrid Physics and Deep Learning Model for Interpretable Vehicle State Prediction [75.1213178617367]
We propose a hybrid approach combining deep learning and physical motion models.
We achieve interpretability by restricting the output range of the deep neural network as part of the hybrid model.
The results show that our hybrid model can improve model interpretability with no decrease in accuracy compared to existing deep learning approaches.
arXiv Detail & Related papers (2021-03-11T15:21:08Z)
- Machine Learning for Robust Identification of Complex Nonlinear Dynamical Systems: Applications to Earth Systems Modeling [8.896888286819635]
Systems exhibiting chaos are ubiquitous across Earth Sciences.
System Identification remains a challenge in climate science.
We consider a chaotic system - two-level Lorenz-96 - used as a benchmark model in the climate science literature.
arXiv Detail & Related papers (2020-08-12T22:37:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.