Non-parametric Probabilistic Time Series Forecasting via Innovations
Representation
- URL: http://arxiv.org/abs/2306.03782v1
- Date: Mon, 5 Jun 2023 02:24:59 GMT
- Title: Non-parametric Probabilistic Time Series Forecasting via Innovations
Representation
- Authors: Xinyi Wang, Meijen Lee, Qing Zhao, Lang Tong
- Abstract summary: Probabilistic time series forecasting predicts the conditional probability distributions of the time series at a future time given past realizations.
Existing approaches are primarily based on parametric or semi-parametric time-series models that are restrictive, difficult to validate, and challenging to adapt to varying conditions.
This paper proposes a nonparametric method based on the classic notion of innovations pioneered by Norbert Wiener and Gopinath Kallianpur.
- Score: 29.255644836978956
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic time series forecasting predicts the conditional probability
distributions of the time series at a future time given past realizations. Such
techniques are critical in risk-based decision-making and planning under
uncertainties. Existing approaches are primarily based on parametric or
semi-parametric time-series models that are restrictive, difficult to validate,
and challenging to adapt to varying conditions. This paper proposes a
nonparametric method based on the classic notion of {\em innovations} pioneered
by Norbert Wiener and Gopinath Kallianpur that causally transforms a
nonparametric random process into an independent and identically distributed (i.i.d.)
uniform {\em innovations process}. We present a machine-learning
architecture and a learning algorithm that circumvent two limitations of the
original Wiener-Kallianpur innovations representation: (i) the need for known
probability distributions of the time series and (ii) the existence of a causal
decoder that reproduces the original time series from the innovations
representation. We develop a deep-learning approach and a Monte Carlo sampling
technique to obtain a generative model for the predicted conditional
probability distribution of the time series based on a weak notion of
Wiener-Kallianpur innovations representation. The efficacy of the proposed
probabilistic forecasting technique is demonstrated on a variety of electricity
price datasets, showing marked improvement over leading benchmarks of
probabilistic forecasting techniques.
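As a rough illustration of the pipeline the abstract describes (a causal encoder that maps the series to i.i.d. uniform innovations, a causal decoder that reconstructs the series, and Monte Carlo sampling of future innovations to form the predictive distribution), the sketch below is a minimal PyTorch version. The module names, layer sizes, window lengths, and sampling loop are assumptions for illustration only; the training objective that actually pushes the innovations toward i.i.d. uniform and enforces weak reconstruction is omitted, and none of this is the authors' implementation.

```python
# Minimal sketch of an innovations-style generative forecaster (illustrative only;
# not the authors' code). Encoder: past window -> innovations in (0, 1).
# Decoder: innovations window -> next value. Monte Carlo sampling of i.i.d.
# Uniform(0, 1) future innovations yields samples from the predictive distribution.
import torch
import torch.nn as nn

WINDOW = 24      # length of the conditioning window (assumed)
N_INNOV = 24     # number of innovations kept by the encoder (assumed)


class CausalEncoder(nn.Module):
    """Past window -> innovations squashed into (0, 1) via a sigmoid."""
    def __init__(self, window: int = WINDOW, n_innov: int = N_INNOV, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(window, hidden), nn.ReLU(),
            nn.Linear(hidden, n_innov), nn.Sigmoid(),
        )

    def forward(self, x_past: torch.Tensor) -> torch.Tensor:
        return self.net(x_past)          # (batch, n_innov)


class CausalDecoder(nn.Module):
    """Innovations window -> next value of the series."""
    def __init__(self, n_innov: int = N_INNOV, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_innov, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, nu: torch.Tensor) -> torch.Tensor:
        return self.net(nu)              # (batch, 1)


@torch.no_grad()
def monte_carlo_forecast(encoder, decoder, x_past, horizon=12, n_samples=500):
    """Empirical predictive distribution: feed sampled uniform innovations,
    together with the innovations of the observed past, through the decoder.
    Assumes encoder/decoder have already been trained as in the paper."""
    nu = encoder(x_past)                                     # innovations of the past
    samples = []
    for _ in range(n_samples):
        nu_s, path = nu.clone(), []
        for _ in range(horizon):
            nu_new = torch.rand(nu_s.shape[0], 1)            # i.i.d. Uniform(0, 1)
            nu_s = torch.cat([nu_s[:, 1:], nu_new], dim=-1)  # slide innovation window
            path.append(decoder(nu_s))
        samples.append(torch.cat(path, dim=-1))              # (batch, horizon)
    return torch.stack(samples)                              # (n_samples, batch, horizon)


# Example: quantiles of the predictive distribution for one (toy) series.
x_past = torch.randn(1, WINDOW)
fcst = monte_carlo_forecast(CausalEncoder(), CausalDecoder(), x_past)
q10, q50, q90 = fcst.quantile(torch.tensor([0.1, 0.5, 0.9]), dim=0)
```

The sigmoid in the encoder is one simple way to keep the learned innovations in (0, 1), so that the Uniform(0, 1) draws used at forecast time match the distribution the innovations are trained to follow; quantiles (or any other statistic) of the sampled paths then serve as the probabilistic forecast.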
Related papers
- ProGen: Revisiting Probabilistic Spatial-Temporal Time Series Forecasting from a Continuous Generative Perspective Using Stochastic Differential Equations [18.64802090861607]
ProGen Pro provides a robust solution that effectively captures dependencies while managing uncertainty.
Our experiments on four benchmark traffic datasets demonstrate that ProGen Pro outperforms state-of-the-art deterministic and probabilistic models.
arXiv Detail & Related papers (2024-11-02T14:37:30Z) - Conformal Generative Modeling with Improved Sample Efficiency through Sequential Greedy Filtering [55.15192437680943]
Generative models lack rigorous statistical guarantees for their outputs.
We propose a sequential conformal prediction method producing prediction sets that satisfy a rigorous statistical guarantee.
This guarantee states that with high probability, the prediction sets contain at least one admissible (or valid) example.
arXiv Detail & Related papers (2024-10-02T15:26:52Z) - Probabilistic Forecasting of Real-Time Electricity Market Signals via Interpretable Generative AI [41.99446024585741]
We present WIAE-GPF, a Weak Innovation AutoEncoder-based Generative Probabilistic Forecasting architecture.
A novel learning algorithm with structural convergence guarantees is proposed, ensuring that the generated forecast samples match the ground truth conditional probability distribution.
arXiv Detail & Related papers (2024-03-09T00:41:30Z) - Generative Probabilistic Time Series Forecasting and Applications in
Grid Operations [47.19756484695248]
Generative probabilistic forecasting produces future time series samples according to the conditional probability distribution given past time series observations.
We propose a weak innovation autoencoder architecture and a learning algorithm to extract independent and identically distributed innovation sequences.
We show that the weak innovation sequence is Bayesian sufficient, which makes the proposed weak innovation autoencoder a canonical architecture for generative probabilistic forecasting.
arXiv Detail & Related papers (2024-02-21T15:23:21Z) - Score Matching-based Pseudolikelihood Estimation of Neural Marked
Spatio-Temporal Point Process with Uncertainty Quantification [59.81904428056924]
We introduce SMASH: a Score MAtching estimator for learning marked spatio-temporal point processes (STPPs) with uncertainty quantification.
Specifically, our framework adopts a normalization-free objective by estimating the pseudolikelihood of marked STPPs through score matching.
The superior performance of our proposed framework is demonstrated through extensive experiments in both event prediction and uncertainty quantification.
arXiv Detail & Related papers (2023-10-25T02:37:51Z) - TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z) - Quantifying Uncertainty in Deep Spatiotemporal Forecasting [67.77102283276409]
We describe two types of forecasting problems: regular grid-based and graph-based.
We analyze UQ methods from both the Bayesian and frequentist points of view, casting them in a unified framework via statistical decision theory.
Through extensive experiments on real-world road network traffic, epidemics, and air quality forecasting tasks, we reveal the statistical and computational trade-offs of different UQ methods.
arXiv Detail & Related papers (2021-05-25T14:35:46Z) - Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting [98.57851612518758]
Probabilistic time series forecasting involves estimating the distribution of a series' future values based on its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
arXiv Detail & Related papers (2021-01-31T06:49:33Z) - Remaining Useful Life Estimation Under Uncertainty with Causal GraphNets [0.0]
A novel approach for the construction and training of time series models is presented.
The proposed method is appropriate for constructing predictive models for non-stationary time series.
arXiv Detail & Related papers (2020-11-23T21:28:03Z)