Quantifying uncertainty in climate projections with conformal ensembles
- URL: http://arxiv.org/abs/2408.06642v2
- Date: Thu, 15 Aug 2024 22:11:25 GMT
- Title: Quantifying uncertainty in climate projections with conformal ensembles
- Authors: Trevor Harris, Ryan Sriver
- Abstract summary: Conformal ensembling is a new approach to uncertainty quantification in climate projections that uses conformal inference to reduce projection uncertainty.
It can be applied to any climatic variable using any ensemble analysis method.
Experiments show that it is effective when conditioning future projections on historical reanalysis data.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Large climate model ensembles are the primary tool for robustly projecting future climate states and quantifying projection uncertainty. Despite significant advancements in climate modeling over the past few decades, overall projection uncertainty has not commensurately decreased with steadily improving model skill. We introduce conformal ensembling, a new approach to uncertainty quantification in climate projections that uses conformal inference to reduce projection uncertainty. Unlike traditional methods, conformal ensembling seamlessly integrates climate model ensembles and observational data across a range of scales to generate statistically rigorous, easy-to-interpret uncertainty estimates. It can be applied to any climatic variable using any ensemble analysis method and outperforms existing inter-model variability methods in uncertainty quantification across all time horizons and most spatial locations under SSP2-4.5. Conformal ensembling is also computationally efficient, requires minimal assumptions, and is highly robust to the choice of conformity measure. Experiments show that conditioning future projections on historical reanalysis data in this way yields more physically consistent projections than standard ensemble-averaging approaches.
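To make the mechanics concrete, below is a minimal sketch of split conformal prediction in the spirit of conformal ensembling, applied to a single gridpoint series: conformity scores are calibrated against historical reanalysis, and the calibrated quantile then widens the future ensemble-mean projection into an interval. The synthetic arrays, the ensemble-mean point predictor, and the absolute-residual conformity score are illustrative assumptions, not the authors' exact formulation, which operates on full spatial fields and supports arbitrary conformity measures.

```python
import numpy as np

# Minimal split-conformal sketch at a single gridpoint (illustrative only).
# Assumed, synthetic inputs -- shapes and names are not from the paper:
#   hist_ens : (n_members, n_hist)  ensemble members over the historical period
#   hist_obs : (n_hist,)            reanalysis values for the same period
#   fut_ens  : (n_members, n_fut)   ensemble members over the future period
rng = np.random.default_rng(0)
n_members, n_hist, n_fut = 30, 100, 80
hist_ens = rng.normal(15.0, 1.0, size=(n_members, n_hist))
hist_obs = rng.normal(15.2, 1.1, size=n_hist)
fut_ens = rng.normal(17.0, 1.2, size=(n_members, n_fut))

alpha = 0.1  # target 90% coverage

# 1) Point predictor from the ensemble (here simply the ensemble mean;
#    any ensemble analysis method could play this role).
hist_pred = hist_ens.mean(axis=0)
fut_pred = fut_ens.mean(axis=0)

# 2) Conformity scores on the historical (calibration) period:
#    absolute residuals between the prediction and reanalysis.
scores = np.abs(hist_obs - hist_pred)

# 3) Conformal quantile with the finite-sample (n + 1) correction.
n = scores.size
k = int(np.ceil((n + 1) * (1 - alpha)))
q_hat = np.sort(scores)[min(k, n) - 1]

# 4) Conformal prediction band for the future period.
lower, upper = fut_pred - q_hat, fut_pred + q_hat
print(f"calibrated half-width: {q_hat:.2f}")
```

The (n + 1) finite-sample correction in step 3 is what yields the distribution-free coverage guarantee, assuming the calibration and test residuals are exchangeable.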
Related papers
- A Deconfounding Approach to Climate Model Bias Correction [26.68810227550602]
Global Climate Models (GCMs) are crucial for predicting future climate changes by simulating the Earth systems.
GCMs exhibit systematic biases due to model uncertainties, parameterization simplifications, and inadequate representation of complex climate phenomena.
This paper proposes a novel bias correction approach to utilize both GCM and observational data to learn a factor model that captures multi-cause latent confounders.
arXiv Detail & Related papers (2024-08-22T01:53:35Z)
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
- Measuring and Modeling Uncertainty Degree for Monocular Depth Estimation [50.920911532133154]
The intrinsic ill-posedness and ordinal-sensitive nature of monocular depth estimation (MDE) models pose major challenges to the estimation of uncertainty degree.
We propose to model the uncertainty of MDE models from the perspective of the inherent probability distributions.
By simply introducing additional training regularization terms, our model, with surprisingly simple formulations and without requiring extra modules or multiple inferences, can provide uncertainty estimations with state-of-the-art reliability.
arXiv Detail & Related papers (2023-07-19T12:11:15Z)
- Deep Ensembles to Improve Uncertainty Quantification of Statistical Downscaling Models under Climate Change Conditions [0.0]
We propose deep ensembles as a simple method to improve the uncertainty quantification of statistical downscaling models.
Deep ensembles allow for a better risk assessment, highly demanded by sectoral applications to tackle climate change.
arXiv Detail & Related papers (2023-04-27T19:53:18Z)
- Toward Reliable Human Pose Forecasting with Uncertainty [51.628234388046195]
We develop an open-source library for human pose forecasting, including multiple models, supporting several datasets.
We devise two types of uncertainty in the problem to increase performance and convey better trust.
arXiv Detail & Related papers (2023-04-13T17:56:08Z)
- A locally time-invariant metric for climate model ensemble predictions of extreme risk [8.347190888362194]
We introduce a locally time-invariant method for evaluating climate model simulations with a focus on assessing the simulation of extremes.
We explore the behaviour of the proposed method in predicting extreme heat days in Nairobi and provide comparative results for eight additional cities.
arXiv Detail & Related papers (2022-11-26T16:41:50Z)
- Uncertainty Quantification for Traffic Forecasting: A Unified Approach [21.556559649467328]
Uncertainty is an essential consideration for time series forecasting tasks.
In this work, we focus on quantifying the uncertainty of traffic forecasting.
We develop Deep Spatio-Temporal Uncertainty Quantification (DeepSTUQ), which can estimate both aleatoric and epistemic uncertainty.
arXiv Detail & Related papers (2022-08-11T15:21:53Z)
- When in Doubt: Neural Non-Parametric Uncertainty Quantification for Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z)
- Closer Look at the Uncertainty Estimation in Semantic Segmentation under Distributional Shift [2.05617385614792]
Uncertainty estimation for the task of semantic segmentation is evaluated under a varying level of domain shift.
It was shown that simple color transformations already provide a strong baseline.
An ensemble of models was utilized in the self-training setting to improve pseudo-label generation.
arXiv Detail & Related papers (2021-05-31T19:50:43Z)
- Efficient Ensemble Model Generation for Uncertainty Estimation with Bayesian Approximation in Segmentation [74.06904875527556]
We propose a generic and efficient segmentation framework to construct ensemble segmentation models.
In the proposed method, ensemble models can be efficiently generated by using the layer selection method.
We also devise a new pixel-wise uncertainty loss, which improves the predictive performance.
arXiv Detail & Related papers (2020-05-21T16:08:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.