DUNE: A Machine Learning Deep UNet++ based Ensemble Approach to Monthly, Seasonal and Annual Climate Forecasting
- URL: http://arxiv.org/abs/2408.06262v1
- Date: Mon, 12 Aug 2024 16:22:30 GMT
- Title: DUNE: A Machine Learning Deep UNet++ based Ensemble Approach to Monthly, Seasonal and Annual Climate Forecasting
- Authors: Pratik Shukla, Milton Halem
- Abstract summary: A novel Deep UNet++-based Ensemble (DUNE) neural architecture is introduced.
It produces the first AI-based global monthly, seasonal, or annual mean forecast of 2-meter temperatures (T2m) and sea surface temperatures (SST).
These forecasts outperform persistence, climatology, and multiple linear regression for all domains.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Capitalizing on the recent availability of ERA5 monthly averaged long-term data records of mean atmospheric and climate fields based on high-resolution reanalysis, deep-learning architectures offer an alternative to physics-based daily numerical weather predictions for subseasonal to seasonal (S2S) and annual means. A novel Deep UNet++-based Ensemble (DUNE) neural architecture is introduced, employing multi-encoder-decoder structures with residual blocks. When initialized from a prior month or year, this architecture produced the first AI-based global monthly, seasonal, or annual mean forecast of 2-meter temperatures (T2m) and sea surface temperatures (SST). ERA5 monthly mean data is used as input for T2m over land, SST over oceans, and solar radiation at the top of the atmosphere for each month of 40 years to train the model. Validation forecasts are performed for an additional two years, followed by five years of forecast evaluations to account for natural annual variability. AI-trained inference forecast weights generate forecasts in seconds, enabling ensemble seasonal forecasts. Root Mean Squared Error (RMSE), Anomaly Correlation Coefficient (ACC), and Heidke Skill Score (HSS) statistics are presented globally and over specific regions. These forecasts outperform persistence, climatology, and multiple linear regression for all domains. DUNE forecasts demonstrate comparable statistical accuracy to NOAA's operational monthly and seasonal probabilistic outlook forecasts over the US but at significantly higher resolutions. RMSE and ACC error statistics for other recent AI-based daily forecasts also show superior performance for DUNE-based forecasts. The DUNE model's application to an ensemble data assimilation cycle shows comparable forecast accuracy with a single high-resolution model, potentially eliminating the need for retraining on extrapolated datasets.
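The abstract evaluates DUNE against persistence, climatology, and multiple linear regression baselines using RMSE, ACC, and HSS statistics. As a point of reference, below is a minimal sketch of how such baselines and latitude-weighted gridded metrics are commonly computed; the function names, array shapes, and cosine-latitude weighting are illustrative assumptions, not the authors' code.

```python
import numpy as np

def latitude_weights(lats_deg):
    """Cosine-latitude area weights, normalized to a mean of 1 (an assumed convention)."""
    w = np.cos(np.deg2rad(lats_deg))
    return w / w.mean()

def rmse(forecast, truth, weights):
    """Latitude-weighted RMSE over a (n_lat, n_lon) field."""
    err2 = weights[:, None] * (forecast - truth) ** 2
    return np.sqrt(err2.mean())

def acc(forecast, truth, climatology, weights):
    """Anomaly Correlation Coefficient: correlation of forecast and observed
    anomalies relative to a monthly climatology."""
    fa, oa, w = forecast - climatology, truth - climatology, weights[:, None]
    num = (w * fa * oa).sum()
    return num / np.sqrt((w * fa ** 2).sum() * (w * oa ** 2).sum())

def heidke_skill_score(forecast_cat, observed_cat, n_categories=3):
    """Heidke Skill Score for categorical (e.g., tercile) forecasts:
    hits relative to the number of correct forecasts expected by chance."""
    n = forecast_cat.size
    hits = (forecast_cat == observed_cat).sum()
    chance = sum((forecast_cat == k).sum() * (observed_cat == k).sum() / n
                 for k in range(n_categories))
    return (hits - chance) / (n - chance)

def persistence_forecast(previous_period_field):
    """Persistence baseline: reuse the prior month/season/year mean field."""
    return previous_period_field

def climatology_forecast(training_fields):
    """Climatology baseline: long-term mean of the training years for the
    target period; shape (n_years, n_lat, n_lon) -> (n_lat, n_lon)."""
    return training_fields.mean(axis=0)
```

Tercile categories (below, near, and above normal) are a common choice of HSS input when comparing against probabilistic outlook products such as NOAA's, though the paper's exact categorization is not specified here.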
Related papers
- An ensemble of data-driven weather prediction models for operational sub-seasonal forecasting [0.08106028186803123]
We present an operations-ready multi-model ensemble weather forecasting system.
It is possible to achieve near-state-of-the-art subseasonal-to-seasonal forecasts using a multi-model ensembling approach with data-driven weather prediction models (a generic ensemble-averaging sketch appears at the end of this page).
arXiv Detail & Related papers (2024-03-22T20:01:53Z) - ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z) - FengWu-GHR: Learning the Kilometer-scale Medium-range Global Weather Forecasting [56.73502043159699]
This work presents FengWu-GHR, the first data-driven global weather forecasting model running at a 0.09° horizontal resolution.
It introduces a novel approach that opens the door for operating ML-based high-resolution forecasts by inheriting prior knowledge from a low-resolution model.
The hindcast of weather prediction in 2022 indicates that FengWu-GHR is superior to the IFS-HRES.
arXiv Detail & Related papers (2024-01-28T13:23:25Z) - Residual Corrective Diffusion Modeling for Km-scale Atmospheric Downscaling [58.456404022536425]
State of the art for physical hazard prediction from weather and climate requires expensive km-scale numerical simulations driven by coarser resolution global inputs.
Here, a generative diffusion architecture is explored for downscaling such global inputs to km-scale, as a cost-effective machine learning alternative.
The model is trained to predict 2km data from a regional weather model over Taiwan, conditioned on a 25km global reanalysis.
arXiv Detail & Related papers (2023-09-24T19:57:22Z) - Long-term drought prediction using deep neural networks based on geospatial weather data [75.38539438000072]
High-quality drought forecasting up to a year in advance is critical for agriculture planning and insurance.
We tackle drought prediction by introducing a systematic end-to-end approach based on deep neural networks.
A key finding is the exceptional performance of a Transformer model, EarthFormer, in making accurate short-term (up to six months) forecasts.
arXiv Detail & Related papers (2023-09-12T13:28:06Z) - Advancing Parsimonious Deep Learning Weather Prediction using the HEALPix Mesh [3.2785715577154595]
We present a parsimonious deep learning weather prediction model to forecast seven atmospheric variables with 3-h time resolution for up to one-year lead times on a 110-km global mesh.
In comparison to state-of-the-art (SOTA) machine learning (ML) weather forecast models, such as Pangu-Weather and GraphCast, our DLWP-HPX model uses coarser resolution and far fewer prognostic variables.
arXiv Detail & Related papers (2023-09-11T16:25:48Z) - W-MAE: Pre-trained weather model with masked autoencoder for multi-variable weather forecasting [7.610811907813171]
We propose a Weather model with Masked AutoEncoder pre-training for weather forecasting.
W-MAE is pre-trained in a self-supervised manner to reconstruct spatial correlations within meteorological variables.
On the temporal scale, we fine-tune the pre-trained W-MAE to predict the future states of meteorological variables.
arXiv Detail & Related papers (2023-04-18T06:25:11Z) - Pangu-Weather: A 3D High-Resolution Model for Fast and Accurate Global Weather Forecast [91.9372563527801]
We present Pangu-Weather, a deep learning based system for fast and accurate global weather forecast.
For the first time, an AI-based method outperforms state-of-the-art numerical weather prediction (NWP) methods in terms of accuracy.
Pangu-Weather supports a wide range of downstream forecast scenarios, including extreme weather forecast and large-member ensemble forecast in real-time.
arXiv Detail & Related papers (2022-11-03T17:19:43Z) - Sub-seasonal forecasting with a large ensemble of deep-learning weather prediction models [6.882042556551611]
We present an ensemble prediction system using a Deep Learning Weather Prediction (DLWP) model.
This model uses convolutional neural networks (CNNs) on a cubed sphere grid to produce global forecasts.
Ensemble spread is primarily produced by randomizing the CNN training process to create a set of 32 DLWP models.
arXiv Detail & Related papers (2021-02-09T20:14:43Z) - A generative adversarial network approach to (ensemble) weather prediction [91.3755431537592]
We use a conditional deep convolutional generative adversarial network to predict the geopotential height of the 500 hPa pressure level, the two-meter temperature and the total precipitation for the next 24 hours over Europe.
The proposed models are trained on 4 years of ERA5 reanalysis data from 2015-2018 with the goal of predicting the associated meteorological fields in 2019.
arXiv Detail & Related papers (2020-06-13T20:53:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.
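Several of the related entries above, including the multi-model sub-seasonal system and the 32-member DLWP ensemble, rely on aggregating member forecasts, and the DUNE abstract likewise highlights ensemble seasonal forecasts. A generic sketch of ensemble combination and spread follows; the weighting scheme and array shapes are hypothetical and not taken from any of the listed papers.

```python
import numpy as np

def ensemble_combine(member_forecasts, weights=None):
    """Combine gridded member forecasts of shape (n_members, n_lat, n_lon)
    into a single field; equal weighting unless per-member weights
    (e.g., from validation skill) are supplied."""
    members = np.asarray(member_forecasts)
    if weights is None:
        return members.mean(axis=0)
    w = np.asarray(weights, dtype=float)
    w /= w.sum()
    # Contract the member axis against the weight vector.
    return np.tensordot(w, members, axes=1)

def ensemble_spread(member_forecasts):
    """Per-grid-point standard deviation across members, a common
    proxy for forecast uncertainty."""
    return np.asarray(member_forecasts).std(axis=0)
```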