Uncertainty-Aware Performance Prediction for Highly Configurable
Software Systems via Bayesian Neural Networks
- URL: http://arxiv.org/abs/2212.13359v1
- Date: Tue, 27 Dec 2022 04:39:26 GMT
- Title: Uncertainty-Aware Performance Prediction for Highly Configurable
Software Systems via Bayesian Neural Networks
- Authors: Huong Ha, Zongwen Fan, Hongyu Zhang
- Abstract summary: We propose a Bayesian deep learning based method, namely BDLPerf, that can incorporate uncertainty into the prediction model.
We develop a novel uncertainty calibration technique to ensure the reliability of the confidence intervals generated by a Bayesian prediction model.
Our experimental results on 10 real-world systems show that BDLPerf achieves higher accuracy than existing approaches.
- Score: 12.607426130997336
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Configurable software systems are employed in many important application
domains. Understanding the performance of the systems under all configurations
is critical to prevent potential performance issues caused by misconfiguration.
However, as the number of configurations can be prohibitively large, it is not
possible to measure the system performance under all configurations. Thus, a
common approach is to build a prediction model from limited measurement data
to predict the performance of all configurations as scalar values. However, it
has been pointed out that there are different sources of uncertainty, arising
from data collection or the modeling process, which can make the scalar
predictions unreliable. To address this problem, we propose a
Bayesian deep learning based method, namely BDLPerf, that can incorporate
uncertainty into the prediction model. BDLPerf can provide both scalar
predictions for configurations' performance and the corresponding confidence
intervals of these scalar predictions. We also develop a novel uncertainty
calibration technique to ensure the reliability of the confidence intervals
generated by a Bayesian prediction model. Finally, we suggest an efficient
hyperparameter tuning technique so as to train the prediction model within a
reasonable amount of time whilst achieving high accuracy. Our experimental
results on 10 real-world systems show that BDLPerf achieves higher accuracy
than existing approaches, in both scalar performance prediction and confidence
interval estimation.
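The abstract does not spell out the BDLPerf architecture, so the following is a minimal sketch of the general idea only: a neural performance model that returns both a scalar prediction and a confidence interval, using Monte Carlo dropout as a stand-in for the paper's Bayesian deep learning model. The configuration encoding, network shape, and normal-approximation interval are illustrative assumptions, not BDLPerf itself.

```python
# Illustrative sketch only: Monte Carlo dropout as a stand-in for a Bayesian
# deep learning performance model; not the BDLPerf architecture.
import torch
import torch.nn as nn

class PerfModel(nn.Module):
    def __init__(self, n_options: int, hidden: int = 64, p_drop: float = 0.1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_options, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, hidden), nn.ReLU(), nn.Dropout(p_drop),
            nn.Linear(hidden, 1),
        )

    def forward(self, x):
        return self.net(x).squeeze(-1)

@torch.no_grad()
def predict_with_interval(model, configs, n_samples=100, z=1.96):
    """Scalar prediction plus an approximate 95% confidence interval.

    Dropout stays active at inference time, so each forward pass samples a
    different sub-network; the spread of the draws estimates model uncertainty.
    """
    model.train()  # keep dropout layers active for MC sampling
    draws = torch.stack([model(configs) for _ in range(n_samples)])
    mean, std = draws.mean(dim=0), draws.std(dim=0)
    return mean, mean - z * std, mean + z * std

# Usage: each row encodes one configuration as 0/1 option settings.
model = PerfModel(n_options=12)
configs = torch.randint(0, 2, (5, 12)).float()
pred, lower, upper = predict_with_interval(model, configs)
```

A calibration step in the spirit of the paper could then widen or narrow these intervals on held-out measurements until their empirical coverage matches the nominal level.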
Related papers
- Provably Reliable Conformal Prediction Sets in the Presence of Data Poisoning [53.42244686183879]
Conformal prediction provides model-agnostic and distribution-free uncertainty quantification.
Yet, conformal prediction is not reliable under poisoning attacks where adversaries manipulate both training and calibration data.
We propose reliable prediction sets (RPS): the first efficient method for constructing conformal prediction sets with provable reliability guarantees under poisoning.
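RPS itself is not reproduced here; as background, below is a minimal sketch of standard split conformal prediction sets for classification, the construction whose coverage guarantee poisoning can break. The softmax-probability score and clean, exchangeable data are assumptions.

```python
# Background sketch: standard split conformal prediction sets (not the RPS
# method). With clean, exchangeable data the resulting sets cover the true
# label with probability at least 1 - alpha.
import numpy as np

def conformal_threshold(cal_probs, cal_labels, alpha=0.1):
    """cal_probs: (n, k) predicted class probabilities on held-out calibration data."""
    n = len(cal_labels)
    scores = 1.0 - cal_probs[np.arange(n), cal_labels]        # nonconformity scores
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    return np.quantile(scores, q_level, method="higher")

def prediction_set(test_probs, threshold):
    """All labels whose nonconformity score falls below the calibrated threshold."""
    return [np.where(1.0 - p <= threshold)[0] for p in test_probs]
```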
arXiv Detail & Related papers (2024-10-13T15:37:11Z)
- Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops and adversarial actors.
We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z)
- Multiclass Alignment of Confidence and Certainty for Network Calibration [10.15706847741555]
Recent studies reveal that deep neural networks (DNNs) are prone to making overconfident predictions.
We propose a new train-time calibration method, which features a simple, plug-and-play auxiliary loss known as multi-class alignment of predictive mean confidence and predictive certainty (MACC).
Our method achieves state-of-the-art calibration performance for both in-domain and out-domain predictions.
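The exact MACC loss is not reproduced in this summary; the sketch below only shows the general shape of such a plug-and-play auxiliary term, aligning per-class mean confidence with an entropy-based certainty measure. The certainty definition and weighting are illustrative assumptions, not the paper's formulation.

```python
# Rough sketch of a train-time auxiliary calibration loss in the spirit of
# MACC; the certainty measure and weighting are assumptions.
import math
import torch
import torch.nn.functional as F

def confidence_certainty_gap(logits: torch.Tensor) -> torch.Tensor:
    probs = F.softmax(logits, dim=-1)                     # (batch, classes)
    mean_conf = probs.mean(dim=0)                         # mean confidence per class
    entropy = -(probs * probs.clamp_min(1e-12).log()).sum(dim=-1)
    certainty = 1.0 - entropy / math.log(probs.size(-1))  # 1 = fully certain
    # Class-wise certainty, weighting each sample by its class probability.
    mean_cert = (probs * certainty.unsqueeze(-1)).sum(dim=0) / probs.sum(dim=0)
    return (mean_conf - mean_cert).abs().mean()

# total_loss = F.cross_entropy(logits, targets) + beta * confidence_certainty_gap(logits)
```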
arXiv Detail & Related papers (2023-09-06T00:56:24Z)
- Quantification of Predictive Uncertainty via Inference-Time Sampling [57.749601811982096]
We propose a post-hoc sampling strategy for estimating predictive uncertainty accounting for data ambiguity.
The method can generate different plausible outputs for a given input and does not assume parametric forms of predictive distributions.
arXiv Detail & Related papers (2023-08-03T12:43:21Z)
- Conformal Prediction Intervals for Remaining Useful Lifetime Estimation [5.171601921549565]
We investigate the conformal prediction (CP) framework that represents uncertainty by predicting sets of possible values for the target variable.
CP formally guarantees that the actual value (true RUL) is covered by the predicted set with a degree of certainty that can be prespecified.
We study three CP algorithms to conformalize any single-point RUL predictor and turn it into a valid interval predictor.
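The three CP algorithms studied in the paper are not reproduced here; the sketch below shows only the simplest variant, split conformal regression with absolute residuals, to illustrate how any single-point RUL predictor can be wrapped into an interval predictor. The predictor, data split, and miscoverage level are placeholders.

```python
# Sketch of split conformal regression: wrap any fitted point predictor so its
# intervals cover the true RUL with probability at least 1 - alpha.
import numpy as np

def conformal_interval(predict, X_cal, y_cal, X_test, alpha=0.1):
    """predict: any fitted point predictor mapping features to RUL estimates."""
    residuals = np.abs(y_cal - predict(X_cal))            # calibration errors
    n = len(y_cal)
    q_level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(residuals, q_level, method="higher")
    preds = predict(X_test)
    return preds - q, preds + q
```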
arXiv Detail & Related papers (2022-12-30T09:34:29Z)
- Probabilistic Deep Learning to Quantify Uncertainty in Air Quality Forecasting [5.007231239800297]
This work applies state-of-the-art techniques of uncertainty quantification in a real-world setting of air quality forecasts.
We describe training probabilistic models and evaluate their predictive uncertainties based on empirical performance, reliability of confidence estimate, and practical applicability.
Our experiments demonstrate that the proposed models perform better than previous works in quantifying uncertainty in data-driven air quality forecasts.
arXiv Detail & Related papers (2021-12-05T17:01:18Z)
- Evaluation of Machine Learning Techniques for Forecast Uncertainty Quantification [0.13999481573773068]
Ensemble forecasting is, so far, the most successful approach to produce relevant forecasts along with an estimation of their uncertainty.
Main limitations of ensemble forecasting are its high computational cost and the difficulty of capturing and quantifying different sources of uncertainty.
In this work, proof-of-concept model experiments are conducted to examine the performance of ANNs trained to predict a corrected state of the system and the state uncertainty using only a single deterministic forecast as input.
arXiv Detail & Related papers (2021-11-29T16:52:17Z)
- Dense Uncertainty Estimation [62.23555922631451]
In this paper, we investigate neural networks and uncertainty estimation techniques to achieve both accurate deterministic prediction and reliable uncertainty estimation.
We study two types of uncertainty estimation solutions, namely ensemble-based methods and generative-model-based methods, and explain their pros and cons when using them in fully/semi/weakly-supervised frameworks.
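As a reference point for the ensemble-based family, here is a minimal deep-ensemble sketch: several independently initialized networks are trained separately, and uncertainty is read off their disagreement. The architecture and training loop are placeholders, not the paper's setup.

```python
# Minimal deep-ensemble sketch: disagreement across independently initialized
# models serves as the uncertainty estimate. Architecture is a placeholder.
import torch
import torch.nn as nn

def make_model(in_dim: int) -> nn.Module:
    return nn.Sequential(nn.Linear(in_dim, 32), nn.ReLU(), nn.Linear(32, 1))

@torch.no_grad()
def ensemble_predict(models, x):
    preds = torch.stack([m(x).squeeze(-1) for m in models])   # (n_models, batch)
    return preds.mean(dim=0), preds.std(dim=0)                # prediction, uncertainty

ensemble = [make_model(8) for _ in range(5)]   # each member would be trained separately
mean, uncertainty = ensemble_predict(ensemble, torch.randn(4, 8))
```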
arXiv Detail & Related papers (2021-10-13T01:23:48Z)
- Towards More Fine-grained and Reliable NLP Performance Prediction [85.78131503006193]
We make two contributions to improving performance prediction for NLP tasks.
First, we examine performance predictors for holistic measures of accuracy like F1 or BLEU.
Second, we propose methods to understand the reliability of a performance prediction model from two angles: confidence intervals and calibration.
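The paper's own reliability analysis is not reproduced here; a common, minimal way to attach a confidence interval to a holistic accuracy measure is a bootstrap over the evaluation set, sketched below. The metric, data, and interval level are illustrative placeholders.

```python
# Sketch: bootstrap confidence interval for an evaluation metric such as
# accuracy; the metric, sample, and level are illustrative placeholders.
import numpy as np

def bootstrap_ci(y_true, y_pred, metric, n_boot=1000, alpha=0.05, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y_true)
    stats = [metric(y_true[idx], y_pred[idx])
             for idx in (rng.integers(0, n, size=n) for _ in range(n_boot))]
    return np.quantile(stats, [alpha / 2, 1 - alpha / 2])

y_true = np.array([1, 0, 1, 1, 0, 1, 0, 1])
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 1])
lower, upper = bootstrap_ci(y_true, y_pred, lambda t, p: float((t == p).mean()))
```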
arXiv Detail & Related papers (2021-02-10T15:23:20Z)
- Bayesian Optimization Meets Laplace Approximation for Robotic Introspection [41.117361086267806]
We introduce a scalable Laplace Approximation (LA) technique to make Deep Neural Networks (DNNs) more introspective.
In particular, we propose a novel Bayesian Optimization (BO) algorithm to mitigate their tendency to under-fit the true weight posterior.
We show that the proposed framework can be scaled up to large datasets and architectures.
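The scalable LA variant and the BO tuning procedure are not reproduced in this summary; the sketch below shows only the basic diagonal, last-layer Laplace idea that such methods build on. The curvature proxy and prior precision are assumptions.

```python
# Rough sketch of a diagonal, last-layer Laplace approximation: keep the MAP
# weights, estimate per-parameter curvature, and sample weights at test time.
# Not the scalable LA or BO procedure from the paper.
import torch

def diagonal_laplace_std(per_example_grads: torch.Tensor, prior_precision: float = 1.0):
    """per_example_grads: (n_examples, n_params) gradients of the loss w.r.t. the
    last-layer weights at the MAP estimate (a Fisher/GGN-style curvature proxy)."""
    precision = prior_precision + (per_example_grads ** 2).sum(dim=0)
    return precision.rsqrt()                      # posterior std per parameter

def sample_weights(map_weights: torch.Tensor, std: torch.Tensor, n_samples: int = 20):
    # Draw weight samples from N(map_weights, diag(std^2)) for MC predictions.
    return map_weights + std * torch.randn(n_samples, *map_weights.shape)
```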
arXiv Detail & Related papers (2020-10-30T09:28:10Z)
- Unlabelled Data Improves Bayesian Uncertainty Calibration under Covariate Shift [100.52588638477862]
We develop an approximate Bayesian inference scheme based on posterior regularisation.
We demonstrate the utility of our method in the context of transferring prognostic models of prostate cancer across globally diverse populations.
arXiv Detail & Related papers (2020-06-26T13:50:19Z)