Conformal prediction for time series
- URL: http://arxiv.org/abs/2010.09107v14
- Date: Wed, 16 Jun 2021 14:03:23 GMT
- Title: Conformal prediction for time series
- Authors: Chen Xu, Yao Xie
- Abstract summary: EnbPI wraps around ensemble predictors; it is closely related to conformal prediction (CP) but does not require data exchangeability.
We perform extensive simulation and real-data analyses to demonstrate its effectiveness compared with existing methods.
- Score: 16.38369532102931
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We develop a general framework for constructing distribution-free prediction
intervals for time series. Theoretically, we establish explicit bounds on
conditional and marginal coverage gaps of estimated prediction intervals, which
asymptotically converge to zero under additional assumptions. We obtain similar
bounds on the size of set differences between oracle and estimated prediction
intervals. Methodologically, we introduce a computationally efficient algorithm
called \texttt{EnbPI} that wraps around ensemble predictors, which is closely
related to conformal prediction (CP) but does not require data exchangeability.
\texttt{EnbPI} avoids data-splitting and model retraining, which makes it
computationally efficient and scalable for sequentially producing prediction
intervals. We perform extensive simulation and real-data analyses to
demonstrate its effectiveness compared with existing methods. We also discuss
the extension of \texttt{EnbPI} to various other applications.
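The abstract describes the mechanics only at a high level: fit an ensemble of bootstrap models once, form leave-one-out residuals from models that never saw a given training point, and turn a residual quantile into the interval width. The following is a minimal Python sketch of that idea, not the authors' reference implementation; the ridge base learner, the mean aggregation, and the fixed residual set are simplifying assumptions.

```python
# Minimal EnbPI-style sketch (illustrative only, not the authors' code):
# fit B bootstrap models once, build leave-one-out residuals from models that
# did not see each training point, and use a residual quantile as the width.
# Assumes numpy arrays for X_train, y_train, X_test.
import numpy as np
from sklearn.base import clone
from sklearn.linear_model import Ridge


def enbpi_style_intervals(X_train, y_train, X_test,
                          base=Ridge(), B=25, alpha=0.1, seed=0):
    rng = np.random.default_rng(seed)
    n = len(y_train)
    boot_idx = [rng.integers(0, n, n) for _ in range(B)]
    models = [clone(base).fit(X_train[idx], y_train[idx]) for idx in boot_idx]

    # Leave-one-out aggregation: for each training point, average only the
    # models whose bootstrap resample did not contain that point.
    train_preds = np.array([m.predict(X_train) for m in models])         # (B, n)
    in_boot = np.array([np.isin(np.arange(n), idx) for idx in boot_idx])  # (B, n)
    loo_pred = np.empty(n)
    for i in range(n):
        out = ~in_boot[:, i]
        loo_pred[i] = train_preds[out, i].mean() if out.any() else train_preds[:, i].mean()
    residuals = np.abs(y_train - loo_pred)

    # Point prediction: ensemble mean (no retraining, no data splitting).
    center = np.mean([m.predict(X_test) for m in models], axis=0)
    width = np.quantile(residuals, 1 - alpha)
    return center - width, center + width
```

The actual \texttt{EnbPI} algorithm additionally slides the residual set forward as new observations arrive, which is what lets the intervals adapt along the time series; that online update is omitted here for brevity.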
Related papers
- Building Conformal Prediction Intervals with Approximate Message Passing [14.951392270119461]
Conformal prediction is a powerful tool for building prediction intervals that are valid in a distribution-free way.
We propose a novel algorithm based on Approximate Message Passing (AMP) to accelerate the computation of prediction intervals.
We show that our method produces prediction intervals that are close to the baseline methods, while being orders of magnitude faster.
arXiv Detail & Related papers (2024-10-21T20:34:33Z)
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Prediction-Powered Inference [68.97619568620709]
Prediction-powered inference is a framework for performing valid statistical inference when an experimental dataset is supplemented with predictions from a machine-learning system.
The framework yields simple algorithms for computing provably valid confidence intervals for quantities such as means, quantiles, and linear and logistic regression coefficients.
Prediction-powered inference could enable researchers to draw valid and more data-efficient conclusions using machine learning.
arXiv Detail & Related papers (2023-01-23T18:59:28Z)
- Sequential Predictive Conformal Inference for Time Series [16.38369532102931]
We present a new distribution-free conformal prediction algorithm for sequential data (e.g., time series).
We specifically account for the fact that time series data are non-exchangeable, which makes many existing conformal prediction algorithms inapplicable.
arXiv Detail & Related papers (2022-12-07T05:07:27Z)
- Conformal prediction set for time-series [16.38369532102931]
Uncertainty quantification is essential to studying complex machine learning methods.
We develop Ensemble Regularized Adaptive Prediction Set (ERAPS) to construct prediction sets for time-series.
We show valid marginal and conditional coverage by ERAPS, which also tends to yield smaller prediction sets than competing methods.
arXiv Detail & Related papers (2022-06-15T23:48:53Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, the framework achieves state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- Applying Regression Conformal Prediction with Nearest Neighbors to time series data [0.0]
This paper presents a way of constructing reliable prediction intervals by using conformal predictors in the context of time series data.
We use the nearest neighbors method based on the fast parameters tuning technique (FPTO-WNN) as the underlying algorithm (a generic split-conformal sketch with a nearest-neighbors regressor appears after this list).
arXiv Detail & Related papers (2021-10-25T15:11:32Z)
- Interpretable Machines: Constructing Valid Prediction Intervals with Random Forests [0.0]
An important issue in recent research using machine learning algorithms is their lack of interpretability.
This paper contributes to closing this gap for the Random Forest regression learner.
Several parametric and non-parametric prediction intervals are provided for Random Forest point predictions (a minimal out-of-bag residual sketch appears after this list).
A thorough investigation through Monte-Carlo simulation is conducted evaluating the performance of the proposed methods.
arXiv Detail & Related papers (2021-03-09T23:05:55Z)
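As referenced in the nearest-neighbors entry above, the underlying construction is standard split conformal regression: hold out a calibration set, compute absolute-residual nonconformity scores on it, and widen point predictions by a finite-sample-corrected quantile of those scores. Below is a generic sketch with a k-nearest-neighbors regressor; it is not the FPTO-WNN procedure from that paper, and the choice of calibration split is an assumption.

```python
# Generic split-conformal regression with a kNN base model (illustrative;
# this is NOT the FPTO-WNN procedure from the paper referenced above).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor


def split_conformal_knn(X_train, y_train, X_cal, y_cal, X_test, k=5, alpha=0.1):
    model = KNeighborsRegressor(n_neighbors=k).fit(X_train, y_train)

    # Nonconformity scores on the held-out calibration set.
    scores = np.abs(y_cal - model.predict(X_cal))
    n_cal = len(scores)

    # Finite-sample-corrected quantile; for time series the calibration set is
    # usually the most recent chronological block rather than a random split.
    level = min(1.0, np.ceil((n_cal + 1) * (1 - alpha)) / n_cal)
    q = np.quantile(scores, level)

    pred = model.predict(X_test)
    return pred - q, pred + q
```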
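For the random-forest entry above, one simple non-parametric construction is to shift the forest's point prediction by quantiles of its out-of-bag residuals. The sketch below assumes that variant; the paper evaluates several parametric and non-parametric intervals, and this code is not claimed to reproduce any of them exactly.

```python
# A simple non-parametric random forest interval: shift point predictions by
# quantiles of out-of-bag residuals (one variant; not necessarily the paper's).
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def rf_oob_interval(X_train, y_train, X_test, alpha=0.1, n_estimators=500, seed=0):
    rf = RandomForestRegressor(n_estimators=n_estimators, oob_score=True,
                               random_state=seed)
    rf.fit(X_train, y_train)

    # Out-of-bag predictions approximate held-out predictions on the training set.
    oob_resid = y_train - rf.oob_prediction_
    lo, hi = np.quantile(oob_resid, [alpha / 2, 1 - alpha / 2])

    pred = rf.predict(X_test)
    return pred + lo, pred + hi
```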
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.