Valid Error Bars for Neural Weather Models using Conformal Prediction
- URL: http://arxiv.org/abs/2406.14483v1
- Date: Thu, 20 Jun 2024 16:45:41 GMT
- Title: Valid Error Bars for Neural Weather Models using Conformal Prediction
- Authors: Vignesh Gopakumar, Joel Oskarsson, Ander Gray, Lorenzo Zanisi, Stanislas Pamela, Daniel Giles, Matt Kusner, Marc Deisenroth
- Abstract summary: We construct and formalise a conformal prediction framework as a post-processing method for estimating uncertainty.
No modifications are required to the model and the computational cost is negligible compared to model training.
We demonstrate the usefulness of the conformal prediction framework on a limited area neural weather model for the Nordic region.
- Score: 0.1806830971023738
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural weather models have shown immense potential as inexpensive and accurate alternatives to physics-based models. However, most models trained to perform weather forecasting do not quantify the uncertainty associated with their forecasts. This limits the trust in the model and the usefulness of the forecasts. In this work we construct and formalise a conformal prediction framework as a post-processing method for estimating this uncertainty. The method is model-agnostic and gives calibrated error bounds for all variables, lead times and spatial locations. No modifications are required to the model and the computational cost is negligible compared to model training. We demonstrate the usefulness of the conformal prediction framework on a limited area neural weather model for the Nordic region. We further explore the advantages of the framework for deterministic and probabilistic models.
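As a concrete illustration of the post-processing idea in the abstract, the sketch below applies split (inductive) conformal prediction cell-wise to gridded forecasts. This is a minimal sketch, assuming absolute residuals as the nonconformity score and illustrative array shapes; it is not the authors' exact implementation.

```python
import numpy as np

def conformal_error_bars(cal_pred, cal_true, test_pred, alpha=0.1):
    """Split conformal prediction as a post-processing step.

    cal_pred, cal_true: calibration forecasts and observations,
        shape (n_cal, n_lead_times, n_lat, n_lon) for one variable.
    test_pred: new forecasts, shape (n_test, n_lead_times, n_lat, n_lon).
    Returns lower/upper bounds with roughly (1 - alpha) marginal
    coverage at every lead time and spatial location.
    """
    n_cal = cal_pred.shape[0]
    # Nonconformity score: absolute residual on the held-out calibration set.
    scores = np.abs(cal_pred - cal_true)                      # (n_cal, T, H, W)
    # Finite-sample-corrected quantile level, applied cell-wise.
    q_level = min(1.0, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)
    qhat = np.quantile(scores, q_level, axis=0)               # (T, H, W)
    # Calibrated error bars around the deterministic forecast.
    return test_pred - qhat, test_pred + qhat
```

Because only forecast/observation pairs from a held-out calibration set are needed, the trained model is left untouched and the cost reduces to a single quantile computation per variable, lead time, and grid cell.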
Related papers
- Uncertainty Quantification of Surrogate Models using Conformal Prediction [7.445864392018774]
We formalise a conformal prediction framework that provides coverage guarantees for predictions in a model-agnostic manner, requiring near-zero computational costs.
The paper focuses on providing statistically valid error bars for deterministic models, as well as establishing guarantees for the error bars of probabilistic models.
arXiv Detail & Related papers (2024-08-19T10:46:19Z)
- Uncertainty quantification for data-driven weather models [0.0]
We study and compare uncertainty quantification methods to generate probabilistic weather forecasts from a state-of-the-art deterministic data-driven weather model, Pangu-Weather.
Specifically, we compare approaches for quantifying forecast uncertainty based on generating ensemble forecasts via perturbations to the initial conditions.
In a case study on medium-range forecasts of selected weather variables over Europe, the probabilistic forecasts obtained by using the Pangu-Weather model in concert with uncertainty quantification methods show promising results.
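For context, a generic way to build such an ensemble from a deterministic model is sketched below; the Gaussian perturbation scale, ensemble size, and the `model(state)` step interface are assumptions for illustration, not details from the paper.

```python
import numpy as np

def ic_perturbation_ensemble(model, x0, n_members=20, sigma=0.01, n_steps=10, rng=None):
    """Ensemble forecast via perturbed initial conditions.

    model: callable advancing a state array by one forecast step.
    x0: initial condition, e.g. shape (n_vars, n_lat, n_lon).
    Returns (ensemble, mean, spread), where ensemble has shape
    (n_members, n_steps, *x0.shape).
    """
    rng = np.random.default_rng() if rng is None else rng
    members = []
    for _ in range(n_members):
        state = x0 + sigma * rng.standard_normal(x0.shape)   # perturbed analysis
        trajectory = []
        for _ in range(n_steps):
            state = model(state)                              # autoregressive rollout
            trajectory.append(state)
        members.append(np.stack(trajectory))
    ensemble = np.stack(members)
    return ensemble, ensemble.mean(axis=0), ensemble.std(axis=0)
```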
arXiv Detail & Related papers (2024-03-20T10:07:51Z)
- Predictive Churn with the Set of Good Models [64.05949860750235]
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
arXiv Detail & Related papers (2024-02-12T16:15:25Z)
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empower a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecasts.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
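The summary does not give the form of Exloss or ExBooster; purely as an illustration of asymmetric, extreme-weighted optimisation, a generic loss of that flavour could look like the following, where the threshold and weights are hypothetical.

```python
import numpy as np

def extreme_weighted_loss(pred, target, threshold, extreme_weight=2.0, under_weight=2.0):
    """Illustrative asymmetric loss emphasising extreme values.

    threshold: climatological extreme threshold (scalar or array
        broadcastable against target).
    """
    err = pred - target
    base = err ** 2
    # Extra weight on samples whose ground truth exceeds the extreme threshold.
    extreme = (target > threshold).astype(float)
    # Penalise under-prediction of extremes (err < 0) more than over-prediction.
    asym = np.where(err < 0, under_weight * base, base)
    return np.mean(base + extreme_weight * extreme * asym)
```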
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
- Attention-Based Ensemble Pooling for Time Series Forecasting [55.2480439325792]
We propose a method for pooling that performs a weighted average over candidate model forecasts.
We test this method on two time-series forecasting problems: multi-step forecasting of the dynamics of the non-stationary Lorenz 63 equation, and one-step forecasting of the weekly incident deaths due to COVID-19.
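A minimal sketch of the pooling step, with the attention mechanism abstracted into externally supplied per-model scores; the shapes and the softmax weighting are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

def weighted_ensemble_pool(forecasts, scores):
    """Weighted average over candidate model forecasts.

    forecasts: shape (n_models, horizon), one forecast per candidate model.
    scores: per-model relevance scores, shape (n_models,); in the paper these
        would come from an attention module, here they are simply inputs.
    """
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                       # softmax over candidate models
    return np.einsum("m,mh->h", weights, forecasts)
```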
arXiv Detail & Related papers (2023-10-24T22:59:56Z)
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to use Bayesian approximation to quantify the forecast uncertainty that deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future state uncertainty has been studied.
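As a generic illustration of dropout-based Bayesian approximation (the exact setup used in the paper is not given here), a standard Monte Carlo dropout sketch is shown below, with the network and sample count as placeholders.

```python
import torch

def mc_dropout_forecast(model, x, n_samples=50):
    """Predictive mean and spread via Monte Carlo dropout.

    model: a torch.nn.Module containing dropout layers.
    x: input tensor, e.g. observed trajectory features.
    """
    model.train()  # keep dropout stochastic at inference time
    with torch.no_grad():
        samples = torch.stack([model(x) for _ in range(n_samples)])
    return samples.mean(dim=0), samples.std(dim=0)
```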
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
- Evaluation of Machine Learning Techniques for Forecast Uncertainty Quantification [0.13999481573773068]
Ensemble forecasting is, so far, the most successful approach to producing relevant forecasts along with an estimate of their uncertainty.
The main limitations of ensemble forecasting are its high computational cost and the difficulty of capturing and quantifying different sources of uncertainty.
In this work, proof-of-concept model experiments are conducted to examine the performance of ANNs trained to predict a corrected state of the system and the state uncertainty using only a single deterministic forecast as input.
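One common way to realise such a network, assumed here only for illustration and not taken from the paper, is a two-headed ANN trained with a heteroscedastic Gaussian negative log-likelihood.

```python
import torch
import torch.nn as nn

class CorrectionNet(nn.Module):
    """Maps a single deterministic forecast to a corrected state
    and a per-feature uncertainty (log-variance)."""

    def __init__(self, n_features, hidden=128):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(n_features, hidden), nn.ReLU())
        self.mean_head = nn.Linear(hidden, n_features)     # corrected state
        self.logvar_head = nn.Linear(hidden, n_features)   # state uncertainty

    def forward(self, forecast):
        h = self.body(forecast)
        return self.mean_head(h), self.logvar_head(h)

def gaussian_nll(mean, logvar, target):
    # Heteroscedastic Gaussian negative log-likelihood (up to a constant).
    return 0.5 * (logvar + (target - mean) ** 2 / logvar.exp()).mean()
```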
arXiv Detail & Related papers (2021-11-29T16:52:17Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work with two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
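As a pointer to what the ensemble-based branch in the entry above typically looks like, here is a minimal deep-ensemble sketch; independently trained members are assumed to be available.

```python
import numpy as np

def deep_ensemble_estimate(models, x):
    """Ensemble-based uncertainty: mean and spread over the predictions
    of several independently trained models on the same input."""
    preds = np.stack([m(x) for m in models])   # (n_models, ...) predictions
    return preds.mean(axis=0), preds.std(axis=0)
```

The spread reflects disagreement between members, at the cost of one forward pass per ensemble member.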