Uncertainty Quantification Techniques for Space Weather Modeling:
Thermospheric Density Application
- URL: http://arxiv.org/abs/2201.02067v1
- Date: Thu, 6 Jan 2022 14:17:50 GMT
- Title: Uncertainty Quantification Techniques for Space Weather Modeling:
Thermospheric Density Application
- Authors: Richard J. Licata and Piyush M. Mehta
- Abstract summary: We propose two techniques to develop nonlinear ML models to predict thermospheric density.
We show the performance for models trained on local and global datasets.
We achieve errors of 11% on independent test data with well-calibrated uncertainty estimates.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning (ML) has often been applied to space weather (SW) problems
in recent years. SW originates from solar perturbations and is comprised of the
resulting complex variations they cause within the systems between the Sun and
Earth. These systems are tightly coupled and not well understood. This creates
a need for skillful models with knowledge about the confidence of their
predictions. One example of such a dynamical system is the thermosphere, the
neutral region of Earth's upper atmosphere. Our inability to forecast it has
severe repercussions in the context of satellite drag and collision avoidance
operations for objects in low Earth orbit. Even with (assumed) perfect driver
forecasts, our incomplete knowledge of the system results in often inaccurate
neutral mass density predictions. Continuing efforts are being made to improve
model accuracy, but density models rarely provide estimates of uncertainty. In
this work, we propose two techniques to develop nonlinear ML models to predict
thermospheric density while providing calibrated uncertainty estimates: Monte
Carlo (MC) dropout and direct prediction of the probability distribution, both
using the negative logarithm of predictive density (NLPD) loss function. We
show the performance for models trained on local and global datasets. This
shows that NLPD provides similar results for both techniques but the direct
probability method has a much lower computational cost. For the global model
regressed on the SET HASDM density database, we achieve errors of 11% on
independent test data with well-calibrated uncertainty estimates. Using an
in-situ CHAMP density dataset, both techniques provide test error on the order
of 13%. The CHAMP models (on independent data) are within 2% of perfect
calibration for all prediction intervals tested. This model can also be used to
obtain global predictions with uncertainties at a given epoch.
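The abstract names two concrete ingredients: a negative-log-predictive-density (NLPD) loss and either MC dropout or a direct mean/variance output head. Below is a minimal sketch of both, assuming a Gaussian predictive distribution; the network size, dropout rate, and input dimension are illustrative assumptions, not the paper's architecture.
```python
import math
import torch
import torch.nn as nn

def gaussian_nlpd(mu, log_var, y):
    """Negative log predictive density of y under N(mu, exp(log_var))."""
    return 0.5 * (math.log(2 * math.pi) + log_var
                  + (y - mu) ** 2 / log_var.exp()).mean()

class DensityNet(nn.Module):
    """Directly predicts the mean and log-variance of (log) mass density."""
    def __init__(self, n_inputs=20, hidden=128, p_drop=0.1):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(n_inputs, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
        )
        self.mu_head = nn.Linear(hidden, 1)       # predictive mean
        self.log_var_head = nn.Linear(hidden, 1)  # predictive log-variance

    def forward(self, x):
        h = self.body(x)
        return self.mu_head(h), self.log_var_head(h)

@torch.no_grad()
def mc_dropout_predict(model, x, n_samples=100):
    """MC-dropout alternative: keep dropout active and average many passes."""
    model.train()                                  # leave dropout switched on
    mus = torch.stack([model(x)[0] for _ in range(n_samples)])
    model.eval()
    return mus.mean(dim=0), mus.var(dim=0)         # predictive mean and spread
```
The direct-probability model needs a single forward pass per prediction, while MC dropout needs n_samples passes, which is consistent with the abstract's observation that the direct method has a much lower computational cost.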
Related papers
- CoDiCast: Conditional Diffusion Model for Weather Prediction with Uncertainty Quantification [25.325450602084484]
CoDiCast is a conditional diffusion model to generate accurate global weather prediction.
It can generate 3-day global weather forecasts, at 6-hour steps and $5.625^\circ$ latitude-longitude resolution, for over 5 variables, in about 12 minutes on a commodity A100 GPU machine with 80GB memory.
arXiv Detail & Related papers (2024-09-09T18:18:47Z)
- Valid Error Bars for Neural Weather Models using Conformal Prediction [0.1806830971023738]
We construct and formalise a conformal prediction framework as a post-processing method for estimating uncertainty.
No modifications are required to the model and the computational cost is negligible compared to model training.
We demonstrate the usefulness of the conformal prediction framework on a limited area neural weather model for the Nordic region.
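As a concrete illustration of such a post-processing recipe, here is a hedged sketch of split conformal prediction in general, not the paper's exact implementation; the model handle, the calibration arrays, and alpha are assumptions.
```python
import numpy as np

def conformal_interval(model, x_cal, y_cal, x_test, alpha=0.1):
    """Symmetric error bars with roughly (1 - alpha) marginal coverage."""
    scores = np.abs(y_cal - model.predict(x_cal))            # nonconformity scores
    n = len(scores)
    q_level = min(np.ceil((n + 1) * (1 - alpha)) / n, 1.0)   # finite-sample correction
    q = np.quantile(scores, q_level, method="higher")
    y_hat = model.predict(x_test)
    return y_hat - q, y_hat + q
```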
arXiv Detail & Related papers (2024-06-20T16:45:41Z)
- ExtremeCast: Boosting Extreme Value Prediction for Global Weather Forecast [57.6987191099507]
We introduce Exloss, a novel loss function that performs asymmetric optimization and highlights extreme values to obtain accurate extreme weather forecast.
We also introduce ExBooster, which captures the uncertainty in prediction outcomes by employing multiple random samples.
Our solution can achieve state-of-the-art performance in extreme weather prediction, while maintaining the overall forecast accuracy comparable to the top medium-range forecast models.
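The exact form of Exloss is not given in this summary, so the following is only a hedged sketch of an asymmetric, extreme-weighted regression loss in that spirit; the weights and the tail quantile are illustrative assumptions.
```python
import torch

def asymmetric_extreme_loss(pred, target, under_w=2.0, tail_w=3.0, tail_q=0.95):
    """Squared error that penalizes under-prediction and upper-tail samples more."""
    err = pred - target
    w = torch.ones_like(err)
    w[err < 0] = under_w                               # asymmetric optimization
    tail = target > torch.quantile(target, tail_q)     # emphasize extreme values
    w = torch.where(tail, w * tail_w, w)
    return (w * err ** 2).mean()
```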
arXiv Detail & Related papers (2024-02-02T10:34:13Z)
- Ensemble models outperform single model uncertainties and predictions for operator-learning of hypersonic flows [43.148818844265236]
Training scientific machine learning (SciML) models on limited high-fidelity data offers one approach to rapidly predict behaviors for situations that have not been seen before.
High-fidelity data is itself in limited quantity to validate all outputs of the SciML model in unexplored input space.
We extend a DeepONet using three different uncertainty mechanisms: mean-variance estimation, evidential uncertainty, and ensembling.
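For the ensembling variant, one standard way to combine members that each predict a mean and a variance is moment matching; this is a sketch under that assumption, not necessarily the paper's exact aggregation rule.
```python
import numpy as np

def ensemble_moments(member_means, member_vars):
    """member_means, member_vars: arrays of shape (n_members, n_points)."""
    mu = member_means.mean(axis=0)
    # Law of total variance: mean member variance (aleatoric-like term)
    # plus the spread of the member means (epistemic-like term).
    var = member_vars.mean(axis=0) + member_means.var(axis=0)
    return mu, var
```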
arXiv Detail & Related papers (2023-10-31T18:07:29Z)
- Residual Corrective Diffusion Modeling for Km-scale Atmospheric Downscaling [58.456404022536425]
State of the art for physical hazard prediction from weather and climate requires expensive km-scale numerical simulations driven by coarser resolution global inputs.
Here, a generative diffusion architecture is explored for downscaling such global inputs to km-scale, as a cost-effective machine learning alternative.
The model is trained to predict 2km data from a regional weather model over Taiwan, conditioned on a 25km global reanalysis.
arXiv Detail & Related papers (2023-09-24T19:57:22Z)
- Confidence and Dispersity Speak: Characterising Prediction Matrix for Unsupervised Accuracy Estimation [51.809741427975105]
This work aims to assess how well a model performs under distribution shifts without using labels.
We use the nuclear norm that has been shown to be effective in characterizing both properties.
We show that the nuclear norm is more accurate and robust in accuracy estimation than existing methods.
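A minimal sketch of the signal itself follows; the normalization shown is one common choice and an assumption, not necessarily the paper's exact formula.
```python
import numpy as np

def nuclear_norm_score(softmax_probs):
    """softmax_probs: (n_samples, n_classes) predicted probability matrix."""
    n, k = softmax_probs.shape
    nuc = np.linalg.norm(softmax_probs, ord="nuc")   # sum of singular values
    return nuc / np.sqrt(n * min(n, k))              # higher => more confident and dispersed
```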
arXiv Detail & Related papers (2023-02-02T13:30:48Z)
- Science through Machine Learning: Quantification of Poststorm Thermospheric Cooling [0.0]
We develop machine learning models to study the presence of post-storm cooling in the middle thermosphere.
We find that both NRLMSIS 2.0 and JB2008-ML do not account for post-storm cooling and perform poorly in periods following strong geomagnetic storms.
Results show that density reductions up to 40% can occur 1--3 days post-storm depending on location and the strength of the storm.
arXiv Detail & Related papers (2022-06-12T19:40:30Z)
- Uncertainty estimation of pedestrian future trajectory using Bayesian approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify uncertainty during forecasting using Bayesian approximation, which deterministic approaches fail to capture.
The effect of dropout weights and long-term prediction on future state uncertainty has been studied.
arXiv Detail & Related papers (2022-05-04T04:23:38Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when used in fully-, semi-, and weakly-supervised frameworks.
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Machine-Learned HASDM Model with Uncertainty Quantification [0.0]
We develop the first thermospheric neutral mass density model with robust and reliable uncertainty estimates.
We compare the best HASDM-ML model to the U.S. Space Force's High Accuracy Satellite Drag Model database.
The model provides robust and reliable uncertainties in the density space over all space weather conditions.
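For context, a hedged sketch of how prediction-interval calibration of this kind is typically checked: for each nominal coverage level, compare the fraction of observations that fall inside the Gaussian predictive interval. Variable names and levels here are illustrative assumptions.
```python
import numpy as np
from scipy.stats import norm

def calibration_curve(y, mu, sigma, levels=(0.5, 0.68, 0.9, 0.95)):
    """Return (nominal, empirical) coverage pairs for Gaussian predictions."""
    out = []
    for p in levels:
        z = norm.ppf(0.5 + p / 2)              # interval half-width in std devs
        inside = np.abs(y - mu) <= z * sigma
        out.append((p, inside.mean()))         # well calibrated: empirical ~= nominal
    return out
```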
arXiv Detail & Related papers (2021-09-16T01:06:44Z)
- Improving Uncertainty Calibration via Prior Augmented Data [56.88185136509654]
Neural networks have proven successful at learning from complex data distributions by acting as universal function approximators.
They are often overconfident in their predictions, which leads to inaccurate and miscalibrated probabilistic predictions.
We propose a solution by seeking out regions of feature space where the model is unjustifiably overconfident, and conditionally raising the entropy of those predictions towards that of the prior distribution of the labels.
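A hedged sketch of that idea: on inputs where confidence is unjustified, penalize the divergence of the predictive distribution from the label prior. How those inputs are selected, and the weight lam, are assumptions here, not the paper's procedure.
```python
import torch
import torch.nn.functional as F

def prior_entropy_regularizer(logits, label_prior, lam=1.0):
    """KL(label prior || predicted softmax), averaged over the batch."""
    log_probs = F.log_softmax(logits, dim=-1)
    kl = torch.sum(label_prior * (label_prior.log() - log_probs), dim=-1)
    return lam * kl.mean()

# Assumed training-time use: total = task_loss
#                                  + prior_entropy_regularizer(model(x_aug), prior)
```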
arXiv Detail & Related papers (2021-02-22T07:02:37Z)
This list is automatically generated from the titles and abstracts of the papers on this site.