Invariant Subspace Decomposition
- URL: http://arxiv.org/abs/2404.09962v1
- Date: Mon, 15 Apr 2024 17:39:44 GMT
- Title: Invariant Subspace Decomposition
- Authors: Margherita Lazzaretto, Jonas Peters, Niklas Pfister
- Abstract summary: We propose a novel framework for linear conditionals that splits the conditional distribution into a time-invariant and a residual time-dependent component.
We show that this decomposition can be utilized both for zero-shot and time-adaptation prediction tasks.
We propose a practical estimation procedure, which automatically infers the decomposition using tools from approximate joint matrix diagonalization.
- Score: 10.655331762491613
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We consider the task of predicting a response Y from a set of covariates X in settings where the conditional distribution of Y given X changes over time. For this to be feasible, assumptions on how the conditional distribution changes over time are required. Existing approaches assume, for example, that changes occur smoothly over time so that short-term prediction using only the recent past becomes feasible. In this work, we propose a novel invariance-based framework for linear conditionals, called Invariant Subspace Decomposition (ISD), that splits the conditional distribution into a time-invariant and a residual time-dependent component. As we show, this decomposition can be utilized both for zero-shot and time-adaptation prediction tasks, that is, settings where either no or a small amount of training data is available at the time points we want to predict Y at, respectively. We propose a practical estimation procedure, which automatically infers the decomposition using tools from approximate joint matrix diagonalization. Furthermore, we provide finite sample guarantees for the proposed estimator and demonstrate empirically that it indeed improves on approaches that do not use the additional invariant structure.
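The estimation procedure named in the abstract rests on approximate joint matrix diagonalization: across time windows, second-moment-type matrices are brought as close to diagonal as possible in one common basis, and directions whose diagonal entries stay (nearly) constant across windows indicate the time-invariant component. The sketch below is not the authors' ISD estimator; it only illustrates the joint-diagonalization idea on synthetic symmetric matrices that share a common eigenbasis, using the eigendecomposition of a random positive combination as a crude stand-in for a proper solver (e.g., Jacobi-angle methods), and then flags directions whose spectrum barely changes over time. The function names and simulation setup are hypothetical.
```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_window_matrices(d=5, n_windows=6, n_invariant=2, noise=0.02):
    """Synthetic symmetric matrices sharing one eigenbasis (hypothetical setup).

    The first `n_invariant` eigenvalues are held fixed across windows
    (the 'time-invariant' directions); the remaining ones drift over time.
    """
    basis, _ = np.linalg.qr(rng.normal(size=(d, d)))  # common orthonormal basis
    mats = []
    for t in range(n_windows):
        eig = np.empty(d)
        eig[:n_invariant] = np.arange(1, n_invariant + 1)               # constant part
        eig[n_invariant:] = 4.0 + t + rng.normal(size=d - n_invariant)  # drifting part
        A = basis @ np.diag(eig) @ basis.T + noise * rng.normal(size=(d, d))
        mats.append((A + A.T) / 2)  # symmetrize after adding noise
    return mats

def approx_joint_diagonalizer(mats):
    """Crude approximate joint diagonalization.

    Takes the eigenvectors of a random positive combination of the matrices;
    for (nearly) commuting matrices this recovers the shared eigenbasis.
    A real implementation would use, e.g., Jacobi-angle methods.
    """
    weights = rng.uniform(0.5, 1.5, size=len(mats))
    _, vecs = np.linalg.eigh(sum(w * A for w, A in zip(weights, mats)))
    return vecs

mats = simulate_window_matrices()
V = approx_joint_diagonalizer(mats)

# Per-window "spectra" in the recovered basis: directions whose value barely
# changes across windows are candidates for the time-invariant subspace.
spectra = np.array([np.diag(V.T @ A @ V) for A in mats])  # shape (n_windows, d)
variability = spectra.std(axis=0)
print("std of each direction across windows:", np.round(variability, 3))
print("likely invariant directions:", np.where(variability < 0.1)[0])
```
Loosely speaking, the flagged low-variability directions play the role of the invariant part that could be reused for zero-shot prediction, while the remaining directions would have to be re-estimated when adapting to a new time point.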
Related papers
- Probabilistic Conformal Prediction with Approximate Conditional Validity [81.30551968980143]
We develop a new method for generating prediction sets that combines the flexibility of conformal methods with an estimate of the conditional distribution.
Our method consistently outperforms existing approaches in terms of conditional coverage.
arXiv Detail & Related papers (2024-07-01T20:44:48Z)
- Conformal time series decomposition with component-wise exchangeability [41.94295877935867]
We present a novel use of conformal prediction for time series forecasting that incorporates time series decomposition.
We find that the method provides promising results on well-structured time series, but can be limited by factors such as the decomposition step for more complex data.
arXiv Detail & Related papers (2024-06-24T16:23:30Z)
- Relaxed Quantile Regression: Prediction Intervals for Asymmetric Noise [51.87307904567702]
Quantile regression is a leading approach for obtaining such intervals via the empirical estimation of quantiles in the distribution of outputs.
We propose Relaxed Quantile Regression (RQR), a direct alternative to quantile regression based interval construction that removes this arbitrary constraint.
We demonstrate that this added flexibility results in intervals with an improvement in desirable qualities.
arXiv Detail & Related papers (2024-06-05T13:36:38Z)
- Variational Prediction [95.00085314353436]
We present a technique for learning a variational approximation to the posterior predictive distribution using a variational bound.
This approach can provide good predictive distributions without test time marginalization costs.
arXiv Detail & Related papers (2023-07-14T18:19:31Z)
- Sequential Predictive Conformal Inference for Time Series [16.38369532102931]
We present a new distribution-free conformal prediction algorithm for sequential data (e.g., time series).
We specifically account for the fact that time series data are non-exchangeable, which makes many existing conformal prediction algorithms inapplicable.
arXiv Detail & Related papers (2022-12-07T05:07:27Z)
- Conformal Inference for Online Prediction with Arbitrary Distribution Shifts [1.2277343096128712]
We consider the problem of forming prediction sets in an online setting where the distribution generating the data is allowed to vary over time.
We develop a novel procedure with provably small regret over all local time intervals of a given width.
We test our techniques on two real-world datasets aimed at predicting stock market volatility and COVID-19 case counts.
arXiv Detail & Related papers (2022-08-17T16:51:12Z)
- Conformal prediction set for time-series [16.38369532102931]
Uncertainty quantification is essential to studying complex machine learning methods.
We develop Ensemble Regularized Adaptive Prediction Set (ERAPS) to construct prediction sets for time-series.
We show valid marginal and conditional coverage by ERAPS, which also tends to yield smaller prediction sets than competing methods.
arXiv Detail & Related papers (2022-06-15T23:48:53Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Multivariate Probabilistic Regression with Natural Gradient Boosting [63.58097881421937]
We propose a Natural Gradient Boosting (NGBoost) approach based on nonparametrically modeling the conditional parameters of the multivariate predictive distribution.
Our method is robust, works out-of-the-box without extensive tuning, is modular with respect to the assumed target distribution, and performs competitively in comparison to existing approaches.
arXiv Detail & Related papers (2021-06-07T17:44:49Z)
- Online Stochastic Convex Optimization: Wasserstein Distance Variation [15.313864176694832]
We consider an online proximal-gradient method to track the minimizers of expectations of smooth convex functions.
We revisit the concepts of estimation and tracking error inspired by systems and control literature.
We provide bounds for them under strong convexity, Lipschitzness of the gradient, and bounds on the probability distribution drift.
arXiv Detail & Related papers (2020-06-02T05:23:22Z)
- Batch Stationary Distribution Estimation [98.18201132095066]
We consider the problem of approximating the stationary distribution of an ergodic Markov chain given a set of sampled transitions.
We propose a consistent estimator that is based on recovering a correction ratio function over the given data.
arXiv Detail & Related papers (2020-03-02T09:10:01Z)