Climate-Invariant Machine Learning
- URL: http://arxiv.org/abs/2112.08440v5
- Date: Wed, 17 Jan 2024 22:41:32 GMT
- Title: Climate-Invariant Machine Learning
- Authors: Tom Beucler, Pierre Gentine, Janni Yuval, Ankitesh Gupta, Liran Peng,
Jerry Lin, Sungduk Yu, Stephan Rasp, Fiaz Ahmed, Paul A. O'Gorman, J. David
Neelin, Nicholas J. Lutsko, Michael Pritchard
- Abstract summary: Current climate models require representations of processes that occur at scales smaller than model grid size.
Recent machine learning (ML) algorithms hold promise to improve such process representations, but tend to extrapolate poorly to climate regimes they were not trained on.
We propose a new framework - termed "climate-invariant" ML - incorporating knowledge of climate processes into ML algorithms.
- Score: 0.8831201550856289
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Projecting climate change is a generalization problem: we extrapolate the
recent past using physical models across past, present, and future climates.
Current climate models require representations of processes that occur at
scales smaller than model grid size, which have been the main source of model
projection uncertainty. Recent machine learning (ML) algorithms hold promise to
improve such process representations, but tend to extrapolate poorly to climate
regimes they were not trained on. To get the best of the physical and
statistical worlds, we propose a new framework - termed "climate-invariant" ML
- incorporating knowledge of climate processes into ML algorithms, and show
that it can maintain high offline accuracy across a wide range of climate
conditions and configurations in three distinct atmospheric models. Our results
suggest that explicitly incorporating physical knowledge into data-driven
models of Earth system processes can improve their consistency, data
efficiency, and generalizability across climate regimes.
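One concrete way to read "climate-invariant" inputs is as physical rescalings of raw variables into quantities whose distributions shift less between cold and warm climates, e.g. mapping specific humidity to relative humidity before training. The sketch below is a minimal illustration of such a preprocessing step; the Tetens saturation formula and the three-column feature layout are assumptions for this example, not details taken from the paper.
```python
import numpy as np

def relative_humidity(q, T, p):
    """Map specific humidity q (kg/kg) to relative humidity, a more
    climate-invariant input, given temperature T (K) and pressure p (Pa).
    Uses the Tetens approximation for saturation vapor pressure."""
    e_sat = 610.94 * np.exp(17.625 * (T - 273.15) / (T - 30.11))  # Pa
    q_sat = 0.622 * e_sat / (p - 0.378 * e_sat)                   # kg/kg
    return q / q_sat

# Hypothetical preprocessing step: rescale the humidity feature before the
# inputs reach any regression model.
# X_raw columns: [q, T, p]; X_invariant columns: [RH, T, p]
X_raw = np.array([[0.010, 300.0, 101325.0],
                  [0.002, 260.0, 50000.0]])
X_invariant = X_raw.copy()
X_invariant[:, 0] = relative_humidity(X_raw[:, 0], X_raw[:, 1], X_raw[:, 2])
```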
Related papers
- Modeling chaotic Lorenz ODE System using Scientific Machine Learning [1.4633779950109127]
In this paper, we have integrated Scientific Machine Learning (SciML) methods into foundational weather models.
By combining the interpretability of physical climate models with the computational power of neural networks, SciML models can prove to be a reliable tool for modeling climate.
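As a minimal illustration of the testbed, the sketch below integrates the Lorenz-63 system and collects (state, derivative) pairs of the kind a SciML surrogate, such as a neural ODE or a learned correction to the right-hand side, could be trained on; the paper's exact training setup is not reproduced here.
```python
import numpy as np
from scipy.integrate import solve_ivp

def lorenz(t, state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Classic Lorenz-63 system, the chaotic ODE used as a testbed."""
    x, y, z = state
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Integrate a reference trajectory; a SciML surrogate would be trained to
# reproduce pairs (state(t), d(state)/dt) sampled from such trajectories.
t_eval = np.linspace(0.0, 20.0, 2000)
sol = solve_ivp(lorenz, (0.0, 20.0), [1.0, 1.0, 1.0], t_eval=t_eval)
states = sol.y.T                                      # shape (2000, 3)
derivs = np.array([lorenz(0.0, s) for s in states])   # supervised targets
```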
arXiv Detail & Related papers (2024-10-09T01:17:06Z)
- Efficient Localized Adaptation of Neural Weather Forecasting: A Case Study in the MENA Region [62.09891513612252]
We focus on limited-area modeling and train our model specifically for localized region-level downstream tasks.
We consider the MENA region due to its unique climatic challenges, where accurate localized weather forecasting is crucial for managing water resources, agriculture and mitigating the impacts of extreme weather events.
Our study aims to validate the effectiveness of integrating parameter-efficient fine-tuning (PEFT) methodologies, specifically Low-Rank Adaptation (LoRA) and its variants, to enhance forecast accuracy, as well as training speed, computational resource utilization, and memory efficiency in weather and climate modeling for specific regions.
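A minimal PyTorch sketch of the LoRA idea referenced here: the pretrained weight is frozen and only a low-rank update is trained, which is what keeps regional fine-tuning cheap in parameters and memory. The layer sizes and placement are hypothetical.
```python
import torch
import torch.nn as nn

class LoRALinear(nn.Module):
    """Minimal LoRA adapter: the frozen base weight is augmented with a
    trainable low-rank update scaled by alpha / r."""
    def __init__(self, base: nn.Linear, r: int = 8, alpha: float = 16.0):
        super().__init__()
        self.base = base
        for p in self.base.parameters():
            p.requires_grad = False          # freeze pretrained weights
        self.A = nn.Parameter(torch.randn(r, base.in_features) * 0.01)
        self.B = nn.Parameter(torch.zeros(base.out_features, r))
        self.scale = alpha / r

    def forward(self, x):
        return self.base(x) + self.scale * (x @ self.A.T) @ self.B.T

# Hypothetical usage: wrap one projection layer of a pretrained forecast model.
layer = LoRALinear(nn.Linear(512, 512))
out = layer(torch.randn(4, 512))
```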
arXiv Detail & Related papers (2024-09-11T19:31:56Z)
- MambaDS: Near-Surface Meteorological Field Downscaling with Topography Constrained Selective State Space Modeling [68.69647625472464]
Downscaling, a crucial task in meteorological forecasting, enables the reconstruction of high-resolution meteorological states for target regions.
Previous downscaling methods lacked tailored designs for meteorology and encountered structural limitations.
We propose a novel model called MambaDS, which enhances the utilization of multivariable correlations and topography information.
arXiv Detail & Related papers (2024-08-20T13:45:49Z)
- Machine Learning Global Simulation of Nonlocal Gravity Wave Propagation [1.3108798582758452]
We present the first-ever global simulation of atmospheric mesoscale processes using machine learning (ML) models trained on the WINDSET dataset.
Using an Attention U-Net-based architecture trained on globally resolved gravity wave (GW) momentum, we illustrate the importance and effectiveness of global nonlocality.
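For orientation, the core building block that distinguishes an Attention U-Net from a plain U-Net is an additive attention gate on the skip connections; a minimal sketch is below (channel counts and the assumption that the gating signal matches the skip resolution are illustrative, not taken from the paper).
```python
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive attention gate of the kind used in Attention U-Nets:
    a coarse gating signal g reweights the encoder skip features x."""
    def __init__(self, channels: int, inter: int):
        super().__init__()
        self.wg = nn.Conv2d(channels, inter, kernel_size=1)
        self.wx = nn.Conv2d(channels, inter, kernel_size=1)
        self.psi = nn.Conv2d(inter, 1, kernel_size=1)

    def forward(self, x, g):
        a = torch.sigmoid(self.psi(torch.relu(self.wg(g) + self.wx(x))))
        return x * a                      # attention-weighted skip features

gate = AttentionGate(channels=64, inter=32)
skip = torch.randn(1, 64, 32, 64)         # e.g., lat x lon feature map
gating = torch.randn(1, 64, 32, 64)
weighted = gate(skip, gating)
```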
arXiv Detail & Related papers (2024-06-20T22:57:38Z)
- Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Parameterizations of subgrid-scale processes are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from short but computationally expensive high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
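One common route to physically consistent data-driven parameterizations is to penalize violations of a known budget alongside the supervised error; whether this matches the paper's specific framework is an assumption, but the sketch below shows the general shape of such a composite objective.
```python
import torch

def constrained_loss(pred, target, conservation_residual, lam=0.1):
    """Hypothetical composite objective: supervised MSE plus a penalty on a
    physical-consistency residual (e.g., a column energy budget that the
    predicted tendencies should close)."""
    mse = torch.mean((pred - target) ** 2)
    physics = torch.mean(conservation_residual(pred) ** 2)
    return mse + lam * physics
```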
arXiv Detail & Related papers (2024-06-06T10:02:49Z)
- Comparing Data-Driven and Mechanistic Models for Predicting Phenology in Deciduous Broadleaf Forests [47.285748922842444]
We train a deep neural network to predict a phenological index from meteorological time series.
We find that this approach outperforms traditional process-based models.
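A minimal sketch of the kind of model described: a recurrent network that maps a year of daily meteorological drivers to a scalar phenological index. The specific architecture, feature set, and sequence length are assumptions for illustration.
```python
import torch
import torch.nn as nn

class PhenologyRegressor(nn.Module):
    """Illustrative sequence model: daily meteorological drivers in,
    a scalar phenological index out (architecture details are assumed)."""
    def __init__(self, n_features: int = 4, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, days, n_features)
        _, h = self.rnn(x)
        return self.head(h[-1]).squeeze(-1)

model = PhenologyRegressor()
met = torch.randn(8, 365, 4)               # e.g., temp, precip, radiation, VPD
index = model(met)                          # predicted phenological index
```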
arXiv Detail & Related papers (2024-01-08T15:29:23Z)
- ClimateSet: A Large-Scale Climate Model Dataset for Machine Learning [26.151056828513962]
Climate models have been key for assessing the impact of climate change and simulating future climate scenarios.
The machine learning (ML) community has taken an increased interest in supporting climate scientists' efforts on various tasks such as climate model emulation, downscaling, and prediction tasks.
Here, we introduce ClimateSet, a dataset containing the inputs and outputs of 36 climate models from the Input4MIPs and CMIP6 archives.
arXiv Detail & Related papers (2023-11-07T04:55:36Z)
- Multi-fidelity climate model parameterization for better generalization and extrapolation [0.3860305383611933]
We show that a multi-fidelity approach, which integrates datasets of different accuracy and abundance, can provide the best of both worlds.
In an application to climate modeling, the multi-fidelity framework yields more accurate climate projections without requiring a major increase in computational resources.
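One simple multi-fidelity strategy consistent with this description is to pretrain on the abundant low-fidelity runs and then fine-tune on the scarce high-fidelity runs at a lower learning rate; the sketch below illustrates that two-stage scheme with placeholder data and does not claim to reproduce the paper's exact method.
```python
import torch
import torch.nn as nn

def fit(model, X, y, epochs, lr):
    """Plain MSE training loop used for both fidelity levels."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(model(X), y)
        loss.backward()
        opt.step()

# Hypothetical two-stage multi-fidelity scheme: learn the bulk of the mapping
# from the abundant low-fidelity runs, then adapt to the scarce high-fidelity
# runs with a smaller learning rate.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
X_lo, y_lo = torch.randn(5000, 10), torch.randn(5000, 1)   # cheap, plentiful
X_hi, y_hi = torch.randn(200, 10), torch.randn(200, 1)     # accurate, scarce
fit(model, X_lo, y_lo, epochs=200, lr=1e-3)
fit(model, X_hi, y_hi, epochs=100, lr=1e-4)
```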
arXiv Detail & Related papers (2023-09-19T01:03:39Z)
- ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z)
- Spatiotemporal modeling of European paleoclimate using doubly sparse Gaussian processes [61.31361524229248]
We build on recent scalable sparse spatiotemporal GPs to reduce the computational burden.
We successfully employ such a doubly sparse GP to construct a probabilistic model of paleoclimate.
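The "sparse" half of the construction can be illustrated with a plain inducing-point (subset-of-regressors) predictive mean, where only kernel blocks against a small set of inducing points are formed; the paper's doubly sparse model additionally exploits a state-space form in time, which this sketch omits.
```python
import numpy as np

def rbf(a, b, ell=1.0, var=1.0):
    """Squared-exponential kernel between two sets of 1-D inputs."""
    d = a[:, None] - b[None, :]
    return var * np.exp(-0.5 * (d / ell) ** 2)

def sparse_gp_mean(X, y, Z, Xs, noise=0.1):
    """Inducing-point (subset-of-regressors) predictive mean: the N x N
    kernel matrix is never formed, only N x M blocks against the inducing
    points Z, which is what makes sparse space-time GPs tractable."""
    Kuf, Kuu = rbf(Z, X), rbf(Z, Z)
    A = noise**2 * Kuu + Kuf @ Kuf.T
    return rbf(Xs, Z) @ np.linalg.solve(A, Kuf @ y)

X = np.linspace(0, 10, 500)                 # dense observation times
y = np.sin(X) + 0.1 * np.random.randn(500)
Z = np.linspace(0, 10, 20)                  # 20 inducing points
Xs = np.linspace(0, 10, 100)
mean = sparse_gp_mean(X, y, Z, Xs)
```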
arXiv Detail & Related papers (2022-11-15T14:15:04Z)
- HECT: High-Dimensional Ensemble Consistency Testing for Climate Models [1.7587442088965226]
Climate models play a crucial role in understanding the effect of environmental changes on climate to help mitigate climate risks and inform decisions.
Large global climate models such as the Community Earth System Model (CESM), are very complex with millions of lines of code describing interactions of the atmosphere, land, oceans, and ice.
Our work uses probabilistic classifiers such as tree-based algorithms and deep neural networks to perform a statistically rigorous goodness-of-fit test on high-dimensional climate model output.
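In the spirit of classifier-based goodness-of-fit testing, the sketch below trains a probabilistic classifier to separate two ensembles and converts its held-out accuracy into a p-value against chance; the exact test statistic used by HECT may differ, and the data here are placeholders.
```python
import numpy as np
from scipy.stats import binomtest
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

def consistency_test(ens_a, ens_b):
    """Classifier two-sample test: if a probabilistic classifier cannot tell
    the two ensembles apart, its held-out accuracy should be indistinguishable
    from chance (large p-value -> ensembles consistent)."""
    X = np.vstack([ens_a, ens_b])
    y = np.r_[np.zeros(len(ens_a)), np.ones(len(ens_b))]
    Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)
    acc = GradientBoostingClassifier().fit(Xtr, ytr).score(Xte, yte)
    n_correct = int(round(acc * len(yte)))
    return acc, binomtest(n_correct, len(yte), 0.5, alternative="greater").pvalue

ens_a = np.random.randn(300, 50)            # reference ensemble members
ens_b = np.random.randn(300, 50)            # modified-configuration ensemble
acc, pval = consistency_test(ens_a, ens_b)
```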
arXiv Detail & Related papers (2020-10-08T15:16:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.