Towards the Development of an Uncertainty Quantification Protocol for
the Natural Gas Industry
- URL: http://arxiv.org/abs/2308.02941v1
- Date: Sat, 5 Aug 2023 18:54:59 GMT
- Title: Towards the Development of an Uncertainty Quantification Protocol for
the Natural Gas Industry
- Authors: Babajide Kolade
- Abstract summary: Uncertainty estimates of simulation results are critical to the decision-making process.
This paper develops a protocol to assess uncertainties in predictions of machine learning and mechanistic simulation models.
It applies the protocol to test cases relevant to the gas distribution industry and presents learnings from its application.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Simulations using machine learning (ML) models and mechanistic models are
often run to inform decision-making processes. Uncertainty estimates of
simulation results are critical to the decision-making process because
simulation results of specific scenarios may have wide, but unspecified,
confidence bounds that may impact subsequent analyses and decisions. The
objective of this work is to develop a protocol to assess uncertainties in
predictions of machine learning and mechanistic simulation models. The protocol
will outline an uncertainty quantification workflow that may be used to
establish credible bounds of predictability on computed quantities of interest
and to assess model sufficiency. The protocol identifies key sources of
uncertainties in machine learning and mechanistic modeling, defines applicable
methods of uncertainty propagation for these sources, and includes
statistically rational estimators for output uncertainties. The work applies
the protocol to test cases relevant to the gas distribution industry and
presents learnings from its application. The paper concludes with a brief
discussion outlining a pathway to the wider adoption of uncertainty
quantification within the industry.
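The propagation step such a protocol defines is most commonly implemented with Monte Carlo sampling: draw the uncertain inputs from their assigned distributions, run the model on each draw, and report statistically rational estimates of the output spread. A minimal sketch, in which the pressure-drop model and the input distributions are hypothetical placeholders rather than anything from the paper:

```python
import random
import statistics

def pressure_drop(flow, roughness):
    """Hypothetical mechanistic model: quadratic loss in flow, linear in roughness."""
    return 0.8 * flow**2 + 12.0 * roughness

def propagate(n_samples=20_000, seed=0):
    """Monte Carlo propagation: sample uncertain inputs, collect output statistics."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n_samples):
        flow = rng.gauss(10.0, 0.5)          # uncertain input: mean 10, sd 0.5
        roughness = rng.uniform(0.01, 0.03)  # uncertain input: interval only
        outputs.append(pressure_drop(flow, roughness))
    outputs.sort()
    mean = statistics.fmean(outputs)
    lo = outputs[int(0.025 * n_samples)]     # empirical 95% bounds
    hi = outputs[int(0.975 * n_samples)]
    return mean, (lo, hi)

mean, (lo, hi) = propagate()
print(f"mean = {mean:.1f}, 95% interval = ({lo:.1f}, {hi:.1f})")
```

The empirical quantiles play the role of the "credible bounds of predictability" on the quantity of interest; a production protocol would additionally separate aleatoric from epistemic sources before sampling.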
Related papers
- Uncertainty measurement for complex event prediction in safety-critical systems [0.36832029288386137]
Complex event processing (CEP) uncertainty is critical for embedded and safety-critical systems.
This paper exemplifies how we can measure uncertainty for the perception and prediction of events.
We present and discuss our results, which are very promising within our field of research and work.
arXiv Detail & Related papers (2024-11-02T15:51:37Z)
- A Probabilistic Perspective on Unlearning and Alignment for Large Language Models [48.96686419141881]
We introduce the first formal probabilistic evaluation framework for Large Language Models (LLMs).
We derive novel metrics with high-probability guarantees concerning the output distribution of a model.
Our metrics are application-independent and allow practitioners to make more reliable estimates about model capabilities before deployment.
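High-probability guarantees about an output distribution are typically obtained by sampling the model and attaching a concentration bound to the observed rate. A sketch of the general idea using a one-sided Hoeffding bound; the `model_emits_fact` stand-in and its 3% rate are invented for illustration, not taken from the paper:

```python
import math
import random

def hoeffding_upper_bound(successes, n, delta=0.05):
    """One-sided Hoeffding bound: with probability >= 1 - delta,
    the true event rate lies below the returned value."""
    return successes / n + math.sqrt(math.log(1.0 / delta) / (2.0 * n))

# Hypothetical stand-in for sampling a model and checking a property of its output
rng = random.Random(42)
def model_emits_fact(prompt):
    return rng.random() < 0.03   # assumed true rate of the undesired behavior

n = 5000
hits = sum(model_emits_fact("probe") for _ in range(n))
bound = hoeffding_upper_bound(hits, n)
print(f"observed rate {hits / n:.3f}, 95%-confidence upper bound {bound:.3f}")
```

The bound, unlike the raw empirical rate, is a statement a practitioner can rely on before deployment: the behavior occurs with probability below `bound` except with probability `delta`.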
arXiv Detail & Related papers (2024-10-04T15:44:23Z)
- Decomposing Uncertainty for Large Language Models through Input Clarification Ensembling [69.83976050879318]
In large language models (LLMs), identifying sources of uncertainty is an important step toward improving reliability, trustworthiness, and interpretability.
In this paper, we introduce an uncertainty decomposition framework for LLMs, called input clarification ensembling.
Our approach generates a set of clarifications for the input, feeds them into an LLM, and ensembles the corresponding predictions.
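The generate-clarify-ensemble loop can be sketched as follows; the `clarify` and `llm_predict` functions here are toy stand-ins for real LLM calls, invented for illustration:

```python
from collections import Counter

def clarify(prompt):
    """Hypothetical clarification generator: each rewrite pins down one
    possible interpretation of an ambiguous input."""
    return [f"{prompt} (interpretation {i})" for i in range(3)]

def llm_predict(prompt):
    """Stand-in for an LLM call returning a label; interpretations disagree."""
    return "negative" if "interpretation 2" in prompt else "positive"

def ensemble_predict(prompt):
    """Run the model on each clarification and ensemble by vote.
    Disagreement ACROSS clarifications signals input (aleatoric) uncertainty;
    spread WITHIN a single clarification would signal model uncertainty."""
    votes = Counter(llm_predict(c) for c in clarify(prompt))
    label, count = votes.most_common(1)[0]
    confidence = count / sum(votes.values())
    return label, confidence

label, conf = ensemble_predict("Is the sentiment good?")
print(label, conf)
```

Because each clarification removes ambiguity from the input, residual disagreement between clarified predictions attributes the uncertainty to the input itself rather than to the model.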
arXiv Detail & Related papers (2023-11-15T05:58:35Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
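The non-parametric flavor of such post-hoc sampling can be illustrated with a stochastic forward pass repeated at inference time; the stand-in model below (dropout-style noise left active at prediction) is an assumed setup for illustration, not the paper's exact mechanism:

```python
import random
import statistics

def model(x, rng):
    """Stand-in predictor with a stochastic component kept active at
    inference (assumed for illustration): noise grows with |x|."""
    return 2.0 * x + rng.gauss(0.0, 0.3 * abs(x))

def predict_with_uncertainty(x, n=2000, seed=1):
    """Sample the model repeatedly; the empirical spread of the plausible
    outputs is the uncertainty estimate -- no parametric form is assumed."""
    rng = random.Random(seed)
    samples = [model(x, rng) for _ in range(n)]
    return statistics.fmean(samples), statistics.stdev(samples)

mean, sd = predict_with_uncertainty(5.0)
print(f"prediction {mean:.2f} +/- {sd:.2f}")
```

Instead of the standard deviation, any empirical summary of the sample set (quantiles, modes of a multimodal output) can be reported, which is what makes the approach distribution-free.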
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Measuring and Modeling Uncertainty Degree for Monocular Depth Estimation [50.920911532133154]
The intrinsic ill-posedness and ordinal-sensitive nature of monocular depth estimation (MDE) models pose major challenges to the estimation of uncertainty degree.
We propose to model the uncertainty of MDE models from the perspective of the inherent probability distributions.
By simply introducing additional training regularization terms, our model, with a surprisingly simple formulation and without requiring extra modules or multiple inferences, can provide uncertainty estimates with state-of-the-art reliability.
arXiv Detail & Related papers (2023-07-19T12:11:15Z)
- Physics-constrained Random Forests for Turbulence Model Uncertainty Estimation [0.0]
We discuss a physics-constrained approach to account for uncertainty of turbulence models.
In order to eliminate user input, we incorporate a data-driven machine learning strategy.
arXiv Detail & Related papers (2023-06-23T08:44:56Z)
- Quantifying Deep Learning Model Uncertainty in Conformal Prediction [1.4685355149711297]
Conformal Prediction is a promising framework for representing the model uncertainty.
In this paper, we explore state-of-the-art CP methodologies and their theoretical foundations.
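The core of the simplest CP variant, split conformal prediction for regression, fits in a few lines: calibrate on held-out residuals, then widen every new prediction by the conformal quantile. The calibration data below is synthetic, for illustration only:

```python
import math
import random

def split_conformal(cal_pairs, alpha=0.1):
    """Split conformal prediction for regression: rank the calibration
    residuals |y - yhat|; any new prediction then gets the interval
    yhat +/- q with marginal coverage >= 1 - alpha."""
    scores = sorted(abs(y - yhat) for yhat, y in cal_pairs)
    n = len(scores)
    k = math.ceil((n + 1) * (1 - alpha))  # conformal quantile index
    return scores[min(k, n) - 1]

rng = random.Random(7)
# Hypothetical calibration set: (prediction, noisy true value) pairs
cal = [(x, x + rng.gauss(0.0, 1.0))
       for x in (rng.uniform(0.0, 10.0) for _ in range(500))]
q = split_conformal(cal, alpha=0.1)
print(f"90% prediction interval half-width: {q:.2f}")
```

The appeal of CP is that this coverage guarantee holds for any underlying model, needing only exchangeability between calibration and test data; the deep-learning question is how to make the intervals adaptive rather than constant-width.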
arXiv Detail & Related papers (2023-06-01T16:37:50Z)
- A Meta-heuristic Approach to Estimate and Explain Classifier Uncertainty [0.4264192013842096]
This work proposes a set of class-independent meta-heuristics that can characterize the complexity of an instance in terms of factors that are mutually relevant to both human and machine learning decision-making.
The proposed measures and framework hold promise for improving model development for more complex instances, as well as providing a new means of model abstention and explanation.
arXiv Detail & Related papers (2023-04-20T13:09:28Z)
- Dense Uncertainty Estimation via an Ensemble-based Conditional Latent Variable Model [68.34559610536614]
We argue that the aleatoric uncertainty is an inherent attribute of the data and can only be correctly estimated with an unbiased oracle model.
We propose a new sampling and selection strategy at train time to approximate the oracle model for aleatoric uncertainty estimation.
Our results show that our solution achieves both accurate deterministic results and reliable uncertainty estimation.
arXiv Detail & Related papers (2021-11-22T08:54:10Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We work on two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully-, semi-, and weakly-supervised frameworks.
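The ensemble-based family reduces to a simple recipe: train several members from different random initializations, predict with the ensemble mean, and read uncertainty off the member disagreement. A sketch in which each "member" is a trivially perturbed linear model standing in for an independently trained network (an assumption for illustration):

```python
import random
import statistics

def make_member(seed):
    """Stand-in for one independently trained network: a linear model whose
    weights vary with the random initialization (assumed, for illustration)."""
    rng = random.Random(seed)
    w = 2.0 + rng.gauss(0.0, 0.1)
    b = rng.gauss(0.0, 0.2)
    return lambda x: w * x + b

ensemble = [make_member(s) for s in range(10)]

def predict(x):
    """Ensemble mean is the prediction; member spread estimates the
    (epistemic) uncertainty at this input."""
    preds = [m(x) for m in ensemble]
    return statistics.fmean(preds), statistics.stdev(preds)

mean, sd = predict(4.0)
print(f"{mean:.2f} +/- {sd:.2f}")
```

The trade-off the paper discusses follows directly from this structure: ensembles give deterministic, accurate means at the cost of training and storing multiple members, while generative approaches amortize that cost into a single stochastic model.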
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Predictive Capability Maturity Quantification using Bayesian Network [0.0]
In nuclear engineering, modeling and simulations (M&Ss) are widely applied to support risk-informed safety analysis.
Due to data gaps, validation becomes a decision-making process under uncertainties.
This paper suggests a framework "Predictive Capability Maturity Quantification using Bayesian network (PCMQBN)" as a quantified framework for assessing simulation adequacy.
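The quantified-adequacy idea can be illustrated with the smallest possible Bayesian update: treat simulation adequacy as a latent node and validation outcomes as evidence. The structure and numbers below are hypothetical illustrations, not PCMQBN's actual network:

```python
# Minimal Bayesian-network-style update for simulation adequacy.
# Prior and likelihoods are invented for illustration.
p_adequate = 0.5                         # prior belief the simulation is adequate
p_pass_given = {True: 0.9, False: 0.3}   # P(validation test passes | adequacy)

def update(prior, passed):
    """Bayes' rule: posterior adequacy after observing one validation outcome."""
    like_true = p_pass_given[True] if passed else 1.0 - p_pass_given[True]
    like_false = p_pass_given[False] if passed else 1.0 - p_pass_given[False]
    num = like_true * prior
    return num / (num + like_false * (1.0 - prior))

belief = p_adequate
for outcome in [True, True, False]:      # results of three validation tests
    belief = update(belief, outcome)
print(f"posterior adequacy: {belief:.3f}")
```

This is the sense in which validation under data gaps becomes a quantified decision process: each test moves a stated belief rather than delivering a binary verdict.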
arXiv Detail & Related papers (2020-08-31T17:09:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.