Physics-constrained deep learning postprocessing of temperature and
humidity
- URL: http://arxiv.org/abs/2212.04487v2
- Date: Fri, 19 May 2023 09:13:39 GMT
- Title: Physics-constrained deep learning postprocessing of temperature and
humidity
- Authors: Francesco Zanetta, Daniele Nerini, Tom Beucler and Mark A. Liniger
- Abstract summary: We propose to achieve physical consistency in deep learning-based postprocessing models.
We find that constraining a neural network to enforce thermodynamic state equations yields physically-consistent predictions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Weather forecasting centers currently rely on statistical postprocessing
methods to minimize forecast error. This improves skill but can lead to
predictions that violate physical principles or disregard dependencies between
variables, which can be problematic for downstream applications and for the
trustworthiness of postprocessing models, especially when they are based on new
machine learning approaches. Building on recent advances in physics-informed
machine learning, we propose to achieve physical consistency in deep
learning-based postprocessing models by integrating meteorological expertise in
the form of analytic equations. Applied to the postprocessing of surface
weather in Switzerland, we find that constraining a neural network to enforce
thermodynamic state equations yields physically consistent predictions of
temperature and humidity without compromising performance. Our approach is
especially advantageous when data is scarce, and our findings suggest that
incorporating domain expertise into postprocessing models makes it possible to
optimize weather forecast information while satisfying application-specific
requirements.
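The paper's exact architecture is not reproduced here, but the core constraint idea can be sketched: instead of predicting temperature, dew point, and relative humidity independently, the network predicts temperature and a non-negative dew-point depression, and relative humidity is derived analytically from the two. A minimal illustration, assuming the Magnus approximation for saturation vapor pressure (the function names below are illustrative, not from the paper):

```python
import math

def magnus_vapor_pressure(t_celsius):
    """Saturation vapor pressure (hPa) via the Magnus approximation."""
    return 6.112 * math.exp(17.62 * t_celsius / (243.12 + t_celsius))

def constrained_outputs(raw_t, raw_td_depression):
    """Map two unconstrained network outputs to physically consistent
    temperature, dew point, and relative humidity.

    A softplus on the dew-point depression guarantees Td <= T, and
    relative humidity is *derived* from (T, Td) with the Magnus formula
    rather than predicted separately, so the three variables satisfy the
    thermodynamic state relation by construction.
    """
    t = raw_t
    depression = math.log1p(math.exp(raw_td_depression))  # softplus, always >= 0
    td = t - depression
    rh = 100.0 * magnus_vapor_pressure(td) / magnus_vapor_pressure(t)
    return t, td, rh
```

Because consistency is built into the output layer rather than encouraged by a loss penalty, predictions remain physically valid even far from the training distribution, e.g. in the data-scarce regimes the abstract highlights.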
Related papers
- On conditional diffusion models for PDE simulations [53.01911265639582]
We study score-based diffusion models for forecasting and assimilation of sparse observations.
We propose an autoregressive sampling approach that significantly improves performance in forecasting.
We also propose a new training strategy for conditional score-based models that achieves stable performance over a range of history lengths.
arXiv Detail & Related papers (2024-10-21T18:31:04Z) - Weather Prediction Using CNN-LSTM for Time Series Analysis: A Case Study on Delhi Temperature Data [0.0]
This study explores a hybrid CNN-LSTM model to enhance temperature forecasting accuracy for the Delhi region.
We employed both direct and indirect methods, including comprehensive data preprocessing and exploratory analysis, to construct and train our model.
Experimental results indicate that the CNN-LSTM model significantly outperforms traditional forecasting methods in terms of both accuracy and stability.
arXiv Detail & Related papers (2024-09-14T11:06:07Z) - Towards Physically Consistent Deep Learning For Climate Model Parameterizations [46.07009109585047]
Parameterizations are a major source of systematic errors and large uncertainties in climate projections.
Deep learning (DL)-based parameterizations, trained on data from computationally expensive short, high-resolution simulations, have shown great promise for improving climate models.
We propose an efficient supervised learning framework for DL-based parameterizations that leads to physically consistent models.
arXiv Detail & Related papers (2024-06-06T10:02:49Z) - Generalizing Weather Forecast to Fine-grained Temporal Scales via Physics-AI Hybrid Modeling [55.13352174687475]
This paper proposes a physics-AI hybrid model (i.e., WeatherGFT) which Generalizes weather forecasts to Finer-grained Temporal scales.
Specifically, we employ a carefully designed PDE kernel to simulate physical evolution on a small time scale.
We introduce a lead time-aware training framework to promote the generalization of the model at different lead times.
arXiv Detail & Related papers (2024-05-22T16:21:02Z) - Decomposing weather forecasting into advection and convection with neural networks [6.78786601630176]
We propose a simple yet effective machine learning model that learns the horizontal movement in the dynamical core and vertical movement in the physical parameterization separately.
Our model provides a new and efficient perspective to simulate the transition of variables in atmospheric models.
arXiv Detail & Related papers (2024-05-10T16:46:32Z) - ClimODE: Climate and Weather Forecasting with Physics-informed Neural ODEs [14.095897879222676]
We present ClimODE, a continuous-time process that implements a key principle of statistical mechanics.
ClimODE models precise weather evolution with value-conserving dynamics, learning global weather transport as a neural flow.
Our approach outperforms existing data-driven methods in global and regional forecasting with an order-of-magnitude smaller parameterization.
arXiv Detail & Related papers (2024-04-15T06:38:21Z) - ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z) - ClimaX: A foundation model for weather and climate [51.208269971019504]
ClimaX is a deep learning model for weather and climate science.
It can be pre-trained with a self-supervised learning objective on climate datasets.
It can be fine-tuned to address a breadth of climate and weather tasks.
arXiv Detail & Related papers (2023-01-24T23:19:01Z) - Stabilizing Machine Learning Prediction of Dynamics: Noise and
Noise-inspired Regularization [58.720142291102135]
Recent work has shown that machine learning (ML) models can be trained to accurately forecast the dynamics of chaotic dynamical systems.
In the absence of mitigating techniques, however, these models can exhibit artificially rapid error growth, leading to inaccurate predictions and/or climate instability.
We introduce Linearized Multi-Noise Training (LMNT), a regularization technique that deterministically approximates the effect of many small, independent noise realizations added to the model input during training.
arXiv Detail & Related papers (2022-11-09T23:40:52Z) - Using Data Assimilation to Train a Hybrid Forecast System that Combines
Machine-Learning and Knowledge-Based Components [52.77024349608834]
We consider the problem of data-assisted forecasting of chaotic dynamical systems when the available data is noisy partial measurements.
We show that by using partial measurements of the state of the dynamical system, we can train a machine learning model to improve predictions made by an imperfect knowledge-based model.
arXiv Detail & Related papers (2021-02-15T19:56:48Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
The site does not guarantee the accuracy of the information presented and is not responsible for any consequences of its use.