Analyzing the effect of prediction accuracy on the distributionally-robust competitive ratio
- URL: http://arxiv.org/abs/2601.06813v1
- Date: Sun, 11 Jan 2026 08:51:40 GMT
- Title: Analyzing the effect of prediction accuracy on the distributionally-robust competitive ratio
- Authors: Toru Yoshinaga, Yasushi Kawase
- Abstract summary: We focus on the distributionally-robust competitive ratio (DRCR), introduced by Sun et al. (ICML 2024). A known structural property is that, for any fixed algorithm, the DRCR decreases linearly as prediction accuracy increases. We apply our results to the ski rental problem, a benchmark problem in online optimization, to identify the conditions on prediction accuracies required for the optimal DRCR to attain a target value.
- Score: 3.616700653734583
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The field of algorithms with predictions aims to improve algorithm performance by integrating machine learning predictions into algorithm design. A central question in this area is how predictions can improve performance, and a key aspect of this analysis is the role of prediction accuracy. In this context, prediction accuracy is defined as a guaranteed probability that an instance drawn from the distribution belongs to the predicted set. As a performance measure that incorporates prediction accuracy, we focus on the distributionally-robust competitive ratio (DRCR), introduced by Sun et al. (ICML 2024). The DRCR is defined as the expected ratio between the algorithm's cost and the optimal cost, where the expectation is taken over the worst-case instance distribution that satisfies the given prediction and accuracy requirement. A known structural property is that, for any fixed algorithm, the DRCR decreases linearly as prediction accuracy increases. Building on this result, we establish that the optimal DRCR value (i.e., the infimum over all algorithms) is a monotone and concave function of prediction accuracy. We further generalize the DRCR framework to a multiple-prediction setting and show that monotonicity and concavity are preserved in this setting. Finally, we apply our results to the ski rental problem, a benchmark problem in online optimization, to identify the conditions on prediction accuracies required for the optimal DRCR to attain a target value. Moreover, we provide a method for computing the critical accuracy, defined as the minimum accuracy required for the optimal DRCR to strictly improve upon the performance attainable without any accuracy guarantee.
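The abstract uses the ski rental problem as its benchmark: rent skis for 1 per day or buy them for a price b, without knowing how many days the season will last. As a minimal illustration of the competitive-ratio notion that the DRCR builds on (not the paper's DRCR computation itself), the sketch below evaluates the classic break-even strategy; the function names and parameters are illustrative assumptions.

```python
def break_even_cost(buy_price: int, days: int) -> int:
    """Cost of the break-even strategy: rent (1 per day) until the rent
    paid would reach the purchase price, then buy on day buy_price."""
    if days < buy_price:
        return days                          # rented every day
    return (buy_price - 1) + buy_price       # rented b-1 days, then bought

def opt_cost(buy_price: int, days: int) -> int:
    """Offline optimum: either rent throughout or buy on day one."""
    return min(days, buy_price)

def competitive_ratio(buy_price: int, horizon: int) -> float:
    """Worst-case ratio of algorithm cost to optimal cost over all
    season lengths up to the given horizon."""
    return max(break_even_cost(buy_price, d) / opt_cost(buy_price, d)
               for d in range(1, horizon + 1))

# The break-even strategy is (2 - 1/b)-competitive.
print(competitive_ratio(10, 100))  # → 1.9
```

The worst case is a season that ends the day after the algorithm buys; the DRCR replaces this single worst-case instance with a worst-case distribution constrained by the prediction and its accuracy guarantee.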
Related papers
- A Switching Framework for Online Interval Scheduling with Predictions [4.352962539265558]
We study online interval scheduling in the irrevocable setting, where each interval must be immediately accepted or rejected upon arrival. We consider this problem in a learning-augmented setting, where the algorithm has access to (machine-learned) predictions. Our main contribution is the SemiTrust-and-Switch framework, which provides a unified approach for combining prediction-based and classical interval scheduling algorithms.
arXiv Detail & Related papers (2025-11-20T10:01:09Z)
- Revisiting Multivariate Time Series Forecasting with Missing Values [65.30332997607141]
Missing values are common in real-world time series. Current approaches have developed an imputation-then-prediction framework that uses imputation modules to fill in missing values, followed by forecasting on the imputed data. This framework overlooks a critical issue: there is no ground truth for the missing values, making the imputation process susceptible to errors that can degrade prediction accuracy. We introduce the Consistency-Regularized Information Bottleneck (CRIB), a novel framework built on the Information Bottleneck principle.
arXiv Detail & Related papers (2025-09-27T20:57:48Z)
- Persuasive Calibration [15.651406777700517]
We adopt the standard calibration framework that regulates predictions to be unbiased conditional on their own value. We show that the optimal predictor is over-confident (resp. under-confident) for high (resp. low) true expected outcomes, while remaining perfectly calibrated in the middle.
arXiv Detail & Related papers (2025-04-04T06:49:56Z)
- Calibrated Probabilistic Forecasts for Arbitrary Sequences [58.54729945445505]
Real-world data streams can change unpredictably due to distribution shifts, feedback loops, and adversarial actors. We present a forecasting framework ensuring valid uncertainty estimates regardless of how data evolves.
arXiv Detail & Related papers (2024-09-27T21:46:42Z)
- Improving Accuracy Without Losing Interpretability: A ML Approach for Time Series Forecasting [4.025941501724274]
In time series forecasting, decomposition-based algorithms break aggregate data into meaningful components.
Recent algorithms often combine machine learning (hereafter ML) methodology with decomposition to improve prediction accuracy.
We propose the W-R algorithm, a hybrid algorithm that combines decomposition and ML from a novel perspective.
arXiv Detail & Related papers (2022-12-13T14:51:10Z)
- Efficient and Differentiable Conformal Prediction with General Function Classes [96.74055810115456]
We propose a generalization of conformal prediction to multiple learnable parameters.
We show that it achieves approximate valid population coverage and near-optimal efficiency within class.
Experiments show that our algorithm is able to learn valid prediction sets and improve the efficiency significantly.
arXiv Detail & Related papers (2022-02-22T18:37:23Z)
- Accurate Prediction and Uncertainty Estimation using Decoupled Prediction Interval Networks [0.0]
We propose a network architecture capable of reliably estimating uncertainty of regression based predictions without sacrificing accuracy.
We achieve this by breaking down the learning of prediction and prediction interval (PI) estimations into a two-stage training process.
We compare the proposed method with current state-of-the-art uncertainty quantification algorithms on synthetic datasets and UCI benchmarks, reducing the error in the predictions by 23 to 34%.
arXiv Detail & Related papers (2022-02-19T19:31:36Z)
- Robustification of Online Graph Exploration Methods [59.50307752165016]
We study a learning-augmented variant of the classical, notoriously hard online graph exploration problem.
We propose an algorithm that naturally integrates predictions into the well-known Nearest Neighbor (NN) algorithm.
arXiv Detail & Related papers (2021-12-10T10:02:31Z)
- CovarianceNet: Conditional Generative Model for Correct Covariance Prediction in Human Motion Prediction [71.31516599226606]
We present a new method to correctly predict the uncertainty associated with the predicted distribution of future trajectories.
Our approach, CovarianceNet, is based on a Conditional Generative Model with Gaussian latent variables.
arXiv Detail & Related papers (2021-09-07T09:38:24Z)
- Optimized conformal classification using gradient descent approximation [0.2538209532048866]
Conformal predictors allow predictions to be made with a user-defined confidence level.
We consider an approach to train the conformal predictor directly with maximum predictive efficiency.
We test the method on several real world data sets and find that the method is promising.
arXiv Detail & Related papers (2021-05-24T13:14:41Z)
- AutoCP: Automated Pipelines for Accurate Prediction Intervals [84.16181066107984]
This paper proposes an AutoML framework called Automatic Machine Learning for Conformal Prediction (AutoCP).
Unlike the familiar AutoML frameworks that attempt to select the best prediction model, AutoCP constructs prediction intervals that achieve the user-specified target coverage rate.
We tested AutoCP on a variety of datasets and found that it significantly outperforms benchmark algorithms.
arXiv Detail & Related papers (2020-06-24T23:13:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.