Ensembling geophysical models with Bayesian Neural Networks
- URL: http://arxiv.org/abs/2010.03561v1
- Date: Wed, 7 Oct 2020 18:32:32 GMT
- Title: Ensembling geophysical models with Bayesian Neural Networks
- Authors: Ushnish Sengupta, Matt Amos, J. Scott Hosking, Carl Edward Rasmussen,
Matthew Juniper, Paul J. Young
- Abstract summary: We develop a novel data-driven ensembling strategy for combining geophysical models.
BayNNE outperforms existing ensembling methods, achieving a 49.4% reduction in RMSE for temporal extrapolation.
Uncertainty is also well-characterized, with 90.6% of the data points in our validation dataset lying within 2 standard deviations.
- Score: 11.972384567130268
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Ensembles of geophysical models improve projection accuracy and express
uncertainties. We develop a novel data-driven ensembling strategy for combining
geophysical models using Bayesian Neural Networks, which infers
spatiotemporally varying model weights and bias while accounting for
heteroscedastic uncertainties in the observations. This produces more accurate
and uncertainty-aware projections without sacrificing interpretability. Applied
to the prediction of total column ozone from an ensemble of 15
chemistry-climate models, we find that the Bayesian neural network ensemble
(BayNNE) outperforms existing ensembling methods, achieving a 49.4% reduction
in RMSE for temporal extrapolation, and a 67.4% reduction in RMSE for polar
data voids, compared to a weighted mean. Uncertainty is also
well-characterized, with 90.6% of the data points in our extrapolation
validation dataset lying within 2 standard deviations and 98.5% within 3
standard deviations.
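The combination rule described in the abstract (spatiotemporally varying model weights plus a bias term, with heteroscedastic observation noise) can be sketched numerically. The sketch below is illustrative only: it uses toy data, a random linear map as a stand-in for the Bayesian neural network that produces the weight and bias fields, and a fixed noise level; the paper's actual BayNNE infers all of these quantities with Bayesian inference.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    """Numerically stable softmax over the last axis."""
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical setup: M ensemble members predicting a 1-D field at N points.
N, M = 100, 3
truth = np.sin(np.linspace(0, 2 * np.pi, N))
members = np.stack(
    [truth + rng.normal(0, s, N) for s in (0.1, 0.3, 0.5)], axis=1
)  # shape (N, M)

# Spatially varying logits (here a toy linear function of position) give
# normalized, interpretable per-location model weights; a bias field can
# absorb shared systematic offsets of the ensemble.
position = np.linspace(-1, 1, N)[:, None]
logits = position @ rng.normal(size=(1, M))  # stand-in for a network output
weights = softmax(logits)                    # (N, M); each row sums to 1
bias = np.zeros(N)                           # stand-in for the learned bias field

prediction = (weights * members).sum(axis=1) + bias

# Heteroscedastic Gaussian negative log-likelihood: the observation noise
# sigma may vary from point to point rather than being a single constant.
sigma = np.full(N, 0.2)
nll = 0.5 * np.mean(
    np.log(2 * np.pi * sigma**2) + (truth - prediction) ** 2 / sigma**2
)
```

Because the weights are a softmax output, they stay non-negative and sum to one at every location, which is what keeps the ensemble interpretable as a local weighting of the member models.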
Related papers
- Hybrid twinning using PBDW and DeepONet for the effective state estimation and prediction on partially known systems [0.0]
We propose an effective hybrid approach that combines physics-based modeling with data-driven learning to enhance state estimation.
We validate the proposed approach on a representative problem involving the Helmholtz equation.
arXiv Detail & Related papers (2025-12-03T12:19:00Z) - Bridging the Gap Between Bayesian Deep Learning and Ensemble Weather Forecasts [100.26854618129039]
Weather forecasting is fundamentally challenged by the chaotic nature of the atmosphere.
Recent advances in Bayesian Deep Learning (BDL) offer a promising but often disconnected alternative.
We bridge these paradigms through a unified hybrid BDL framework for ensemble weather forecasting.
arXiv Detail & Related papers (2025-11-18T07:49:52Z) - ReconMOST: Multi-Layer Sea Temperature Reconstruction with Observations-Guided Diffusion [48.540756751934836]
ReconMOST is a data-driven guided diffusion model framework for multi-layer sea temperature reconstruction.
Our method extends ML-based SST reconstruction to a global, multi-layer setting, handling over 92.5% missing data.
arXiv Detail & Related papers (2025-06-12T06:27:22Z) - Cooperative Bayesian and variance networks disentangle aleatoric and epistemic uncertainties [0.0]
Real-world data contains aleatoric uncertainty - irreducible noise arising from imperfect measurements or from incomplete knowledge about the data generation process.
Mean variance estimation (MVE) networks can learn this type of uncertainty but require ad-hoc regularization strategies to avoid overfitting.
We propose to train a variance network with a Bayesian neural network and demonstrate that the resulting model disentangles aleatoric and epistemic uncertainties while improving the mean estimation.
arXiv Detail & Related papers (2025-05-05T15:50:52Z) - Long-term drought prediction using deep neural networks based on geospatial weather data [75.38539438000072]
High-quality drought forecasting up to a year in advance is critical for agriculture planning and insurance.
We tackle drought prediction by introducing a systematic end-to-end deep learning approach.
A key finding is the exceptional performance of the Transformer model EarthFormer in making accurate short-term (up to six months) forecasts.
arXiv Detail & Related papers (2023-09-12T13:28:06Z) - Jensen-Shannon Divergence Based Novel Loss Functions for Bayesian Neural Networks [2.4554686192257424]
We formulate a novel loss function for BNNs based on a new modification to the generalized Jensen-Shannon (JS) divergence, which is bounded.
We find that JS divergence-based variational inference is intractable, and hence employ a constrained optimization framework to formulate these losses.
Our theoretical analysis and empirical experiments on multiple regression and classification data sets suggest that the proposed losses perform better than the KL divergence-based loss, especially when the data sets are noisy or biased.
arXiv Detail & Related papers (2022-09-23T01:47:09Z) - Uncertainty-guided Source-free Domain Adaptation [77.3844160723014]
Source-free domain adaptation (SFDA) aims to adapt a classifier to an unlabelled target data set by only using a pre-trained source model.
We propose quantifying the uncertainty in the source model predictions and utilizing it to guide the target adaptation.
arXiv Detail & Related papers (2022-08-16T08:03:30Z) - Uncertainty Quantification Techniques for Space Weather Modeling:
Thermospheric Density Application [0.0]
We propose two techniques to develop nonlinear ML models to predict thermospheric density.
We show the performance for models trained on local and global datasets.
We achieve errors of 11% on independent test data with well-calibrated uncertainty estimates.
arXiv Detail & Related papers (2022-01-06T14:17:50Z) - Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z) - Quantifying Model Predictive Uncertainty with Perturbation Theory [21.591460685054546]
We propose a framework for predictive uncertainty quantification of a neural network.
We use perturbation theory from quantum physics to formulate a moment decomposition problem.
Our approach provides fast model predictive uncertainty estimates with much greater precision and calibration.
arXiv Detail & Related papers (2021-09-22T17:55:09Z) - Calibration and Uncertainty Quantification of Bayesian Convolutional
Neural Networks for Geophysical Applications [0.0]
Subsurface models should provide calibrated probabilities and the associated uncertainties in their predictions.
It has been shown that popular Deep Learning-based models are often miscalibrated, and due to their deterministic nature, provide no means to interpret the uncertainty of their predictions.
We compare three different approaches obtaining probabilistic models based on convolutional neural networks in a Bayesian formalism.
arXiv Detail & Related papers (2021-05-25T17:54:23Z) - Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
arXiv Detail & Related papers (2021-02-22T07:02:37Z) - Bayesian Graph Neural Networks for Molecular Property Prediction [15.160090982544867]
This study benchmarks a set of Bayesian methods applied to a directed MPNN, using the QM9 regression dataset.
We find that capturing uncertainty in both readout and message passing parameters yields enhanced predictive accuracy, calibration, and performance on a downstream molecular search task.
arXiv Detail & Related papers (2020-11-25T22:32:54Z) - Unlabelled Data Improves Bayesian Uncertainty Calibration under
Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z) - Uncertainty Estimation Using a Single Deep Deterministic Neural Network [66.26231423824089]
We propose a method for training a deterministic deep model that can find and reject out of distribution data points at test time with a single forward pass.
We scale training with a novel loss function and centroid updating scheme, and match the accuracy of softmax models.
arXiv Detail & Related papers (2020-03-04T12:27:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.