Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting
- URL: http://arxiv.org/abs/2102.00397v1
- Date: Sun, 31 Jan 2021 06:49:33 GMT
- Title: Learning Interpretable Deep State Space Model for Probabilistic Time
Series Forecasting
- Authors: Longyuan Li, Junchi Yan, Xiaokang Yang, and Yaohui Jin
- Abstract summary: Probabilistic time series forecasting involves estimating the distribution of future values based on their history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
- Score: 98.57851612518758
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic time series forecasting involves estimating the distribution of
future values based on their history, which is essential for risk management in
downstream decision-making. We propose a deep state space model for
probabilistic time series forecasting whereby the non-linear emission model and
transition model are parameterized by networks and the dependency is modeled by
recurrent neural nets. We take the automatic relevance determination (ARD) view
and devise a network to exploit the exogenous variables in addition to time
series. In particular, our ARD network can incorporate the uncertainty of the
exogenous variables and eventually helps identify useful exogenous variables
and suppress those irrelevant for forecasting. The distribution of multi-step
ahead forecasts is approximated by Monte Carlo simulation. We show in
experiments that our model produces accurate and sharp probabilistic forecasts.
The estimated uncertainty of our forecasts also increases realistically over
time, in a spontaneous manner.
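The abstract's forecasting pipeline, a latent state with network-parameterized non-linear transition and emission models whose multi-step forecast distribution is approximated by Monte Carlo simulation, can be sketched as a toy example. The networks here are untrained random MLPs and all names, sizes, and noise scales are illustrative, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_mlp(d_in, d_hidden, d_out):
    """Random one-hidden-layer network weights (untrained, for illustration)."""
    return (rng.normal(0, 0.5, (d_in, d_hidden)), np.zeros(d_hidden),
            rng.normal(0, 0.5, (d_hidden, d_out)), np.zeros(d_out))

def mlp(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

d_state, d_obs = 4, 1
transition = init_mlp(d_state, 16, d_state)  # z_t = f(z_{t-1}) + noise
emission = init_mlp(d_state, 16, d_obs)      # y_t = g(z_t) + noise

def forecast(z0, horizon, n_samples=500, trans_std=0.1, obs_std=0.1):
    """Monte Carlo approximation of the multi-step forecast distribution:
    roll the latent state forward with sampled transition noise, then emit."""
    samples = np.empty((n_samples, horizon, d_obs))
    z = np.tile(z0, (n_samples, 1))
    for t in range(horizon):
        z = mlp(transition, z) + rng.normal(0, trans_std, z.shape)
        samples[:, t] = mlp(emission, z) + rng.normal(0, obs_std, (n_samples, d_obs))
    return samples

paths = forecast(np.zeros(d_state), horizon=12)
point = paths.mean(axis=0)                         # point forecast
lo, hi = np.quantile(paths, [0.05, 0.95], axis=0)  # 90% predictive interval
```

Here `point`, `lo`, and `hi` give a point forecast and a 90% predictive interval per horizon step; in the actual model the networks would be trained, and an ARD network would additionally weight exogenous inputs.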
Related papers
- Weather Prediction with Diffusion Guided by Realistic Forecast Processes [49.07556359513563]
We introduce a novel method that applies diffusion models (DM) for weather forecasting.
Our method can achieve both direct and iterative forecasting with the same modeling framework.
The flexibility and controllability of our model empowers a more trustworthy DL system for the general weather community.
arXiv Detail & Related papers (2024-02-06T21:28:42Z) - When Rigidity Hurts: Soft Consistency Regularization for Probabilistic
Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distributions of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z) - Uncertainty estimation of pedestrian future trajectory using Bayesian
approximation [137.00426219455116]
Under dynamic traffic scenarios, planning based on deterministic predictions is not trustworthy.
The authors propose to quantify uncertainty during forecasting via Bayesian approximation, which deterministic approaches cannot capture.
The effect of dropout weights and long-term prediction on future-state uncertainty is studied.
arXiv Detail & Related papers (2022-05-04T04:23:38Z) - Probabilistic Forecasting with Generative Networks via Scoring Rule
Minimization [5.5643498845134545]
We use generative neural networks to parametrize distributions on high-dimensional spaces by transforming draws from a latent variable.
We train generative networks to minimize a predictive-sequential (or prequential) scoring rule on a recorded temporal sequence of the phenomenon of interest.
Our method outperforms state-of-the-art adversarial approaches, especially in probabilistic calibration.
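To make the scoring-rule idea concrete, here is a minimal sample-based estimate of the CRPS, one well-known proper scoring rule. The paper itself trains against prequential scoring rules, so this specific estimator is an illustrative stand-in, not their objective:

```python
import numpy as np

def crps_from_samples(y, samples):
    """Sample-based CRPS estimate for a scalar outcome y and an ensemble of
    forecast samples: spread around the outcome minus half the internal
    spread of the ensemble (lower is better)."""
    s = np.asarray(samples, dtype=float)
    term1 = np.mean(np.abs(s - y))                          # distance to outcome
    term2 = 0.5 * np.mean(np.abs(s[:, None] - s[None, :]))  # ensemble spread
    return term1 - term2
```

A generative forecaster trained to minimize such a rule is rewarded both for accuracy and for honest spread, which is why calibration improves.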
arXiv Detail & Related papers (2021-12-15T15:51:12Z) - Probabilistic Time Series Forecasting with Implicit Quantile Networks [0.7249731529275341]
We combine an autoregressive recurrent neural network to model temporal dynamics with Implicit Quantile Networks to learn a large class of distributions over a time-series target.
Our approach is favorable in terms of point-wise prediction accuracy as well as in estimating the underlying temporal distribution.
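The quantile (pinball) loss that underlies implicit quantile networks can be written in a few lines. This is the standard textbook form, not code from the paper:

```python
import numpy as np

def pinball_loss(y, y_hat, tau):
    """Quantile (pinball) loss at level tau in (0, 1). In an implicit quantile
    network, tau is sampled and fed to the network alongside the input, and
    the network output y_hat is trained toward the tau-quantile of the target."""
    diff = y - y_hat
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))
```

At tau = 0.5 this reduces to half the absolute error; sampling tau uniformly during training lets one network represent a whole family of quantiles, hence a full predictive distribution.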
arXiv Detail & Related papers (2021-07-08T10:37:24Z) - RNN with Particle Flow for Probabilistic Spatio-temporal Forecasting [30.277213545837924]
Many classical statistical models often fall short in handling the complexity and high non-linearity present in time-series data.
In this work, we consider the time-series data as a random realization from a nonlinear state-space model.
We use particle flow as the tool for approximating the posterior distribution of the states, as it is shown to be highly effective in complex, high-dimensional settings.
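For intuition about posterior approximation in a nonlinear state-space model, a single bootstrap particle filter step is easy to sketch. Note that the paper uses particle flow, a different scheme; this simpler cousin is only an illustration, and the toy model below is made up:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_filter_step(particles, y_obs, transition, loglik):
    """One propagate / weight / resample step of a bootstrap particle filter."""
    particles = transition(particles)       # sample the state transition
    logw = loglik(y_obs, particles)         # log observation likelihood
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]                   # resample in proportion to weight

# Toy model: z_t = 0.9 z_{t-1} + N(0, 0.5^2), y_t = z_t + N(0, 1).
transition = lambda z: 0.9 * z + rng.normal(0.0, 0.5, z.shape)
loglik = lambda y, z: -0.5 * (y - z) ** 2

prior = rng.normal(0.0, 1.0, 2000)          # particles before seeing y
posterior = bootstrap_filter_step(prior, 5.0, transition, loglik)
```

After observing y = 5, the resampled particles shift toward the observation; particle flow aims at the same posterior but moves particles smoothly instead of resampling, which degrades less in high dimensions.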
arXiv Detail & Related papers (2021-06-10T21:49:23Z) - When in Doubt: Neural Non-Parametric Uncertainty Quantification for
Epidemic Forecasting [70.54920804222031]
Most existing forecasting models disregard uncertainty quantification, resulting in mis-calibrated predictions.
Recent works in deep neural models for uncertainty-aware time-series forecasting also have several limitations.
We model the forecasting task as a probabilistic generative process and propose a functional neural process model called EPIFNP.
arXiv Detail & Related papers (2021-06-07T18:31:47Z) - Deep Distributional Time Series Models and the Probabilistic Forecasting
of Intraday Electricity Prices [0.0]
We propose two approaches to constructing deep time series probabilistic models.
In the first, the output layer of an echo state network (ESN) has disturbances and a shrinkage prior for additional regularization.
The second approach employs the implicit copula of an ESN with Gaussian disturbances, which is a deep copula process on the feature space.
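A minimal echo state network with a ridge-regression readout shows where the paper's two approaches attach, namely at the output layer. All hyperparameters here are illustrative guesses, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(2)

class EchoStateNetwork:
    """Minimal ESN: a fixed random reservoir with a trained linear readout."""
    def __init__(self, n_in=1, n_res=100, spectral_radius=0.9):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # Rescale so the reservoir (roughly) has the echo state property.
        self.W = W * (spectral_radius / np.max(np.abs(np.linalg.eigvals(W))))
        self.x = np.zeros(n_res)

    def step(self, u):
        self.x = np.tanh(self.W_in @ np.atleast_1d(u) + self.W @ self.x)
        return self.x

    def fit_readout(self, states, targets, ridge=1e-6):
        # Ridge regression for the output layer, the layer the paper augments
        # with disturbances and a shrinkage prior.
        S = np.asarray(states)
        y = np.asarray(targets)
        self.W_out = np.linalg.solve(S.T @ S + ridge * np.eye(S.shape[1]), S.T @ y)

    def predict(self, states):
        return np.asarray(states) @ self.W_out

# One-step-ahead forecasting of a sine wave.
series = np.sin(0.2 * np.arange(400))
esn = EchoStateNetwork()
states = np.array([esn.step(u).copy() for u in series[:-1]])
washout = 50                                 # discard the initial transient
esn.fit_readout(states[washout:], series[1 + washout:])
pred = esn.predict(states[washout:])
rmse = np.sqrt(np.mean((pred - series[1 + washout:]) ** 2))
```

Only the readout is trained, which is what makes it cheap to replace the plain least-squares output layer with a probabilistic one, as both of the paper's approaches do.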
arXiv Detail & Related papers (2020-10-05T08:02:29Z) - Adversarial Attacks on Probabilistic Autoregressive Forecasting Models [7.305979446312823]
We develop an effective method for generating adversarial attacks on neural models that output a sequence of probability distributions rather than a sequence of single values.
We demonstrate that our approach can successfully generate attacks with small input perturbations in two challenging tasks.
arXiv Detail & Related papers (2020-03-08T13:08:34Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information and is not responsible for any consequences of its use.