Transforming Autoregression: Interpretable and Expressive Time Series Forecast
- URL: http://arxiv.org/abs/2110.08248v1
- Date: Fri, 15 Oct 2021 17:58:49 GMT
- Title: Transforming Autoregression: Interpretable and Expressive Time Series Forecast
- Authors: David Rügamer, Philipp F.M. Baumann, Thomas Kneib, Torsten Hothorn
- Abstract summary: We propose Autoregressive Transformation Models (ATMs), a model class inspired by various research directions.
ATMs unite expressive distributional forecasts using a semi-parametric distribution assumption with an interpretable model specification.
We demonstrate the properties of ATMs both theoretically and through empirical evaluation on several simulated and real-world forecasting datasets.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Probabilistic forecasting of time series is important in many
applications and research fields. In order to draw conclusions from a
probabilistic forecast, we must ensure that the model class used to
approximate the true forecasting distribution is expressive enough. Yet,
characteristics of the model itself, such as its uncertainty or its general
functioning, are no less important. In this paper, we propose Autoregressive
Transformation Models (ATMs), a model class inspired by various research
directions such as normalizing flows and autoregressive models. ATMs unite
expressive distributional forecasts using a semi-parametric distribution
assumption with an interpretable model specification, and allow for
uncertainty quantification based on (asymptotic) Maximum Likelihood theory.
We demonstrate the properties of ATMs both theoretically and through
empirical evaluation on several simulated and real-world forecasting
datasets.
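
For intuition, the transformation-model mechanism behind ATMs can be written
down compactly. The sketch below uses schematic notation (the reference
distribution F_Z, the monotone transformation h, and the lag order p are
illustrative choices, not the paper's exact specification): the conditional
distribution of y_t given its past is obtained by mapping y_t onto a simple
reference distribution,

    F_{Y_t \mid y_{t-1}, \dots, y_{t-p}}(y) = F_Z\bigl( h(y \mid y_{t-1}, \dots, y_{t-p}) \bigr),

where h is monotonically increasing in its first argument. By the
change-of-variables formula, each observation then contributes

    \ell_t = \log f_Z\bigl( h(y_t \mid \dots) \bigr) + \log \partial_y h(y \mid \dots) \big|_{y = y_t}

to the log-likelihood, which is what makes standard (asymptotic) Maximum
Likelihood theory available for uncertainty quantification.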
Related papers
- Exchangeable Sequence Models Can Naturally Quantify Uncertainty Over Latent Concepts [5.095571791233068] (arXiv, 2024-08-06)
We show that pre-trained sequence models are naturally capable of probabilistic reasoning over exchangeable data points.
A sequence model learns the relationship between observations, which differs from typical Bayesian models.
We show the sequence prediction loss controls the quality of uncertainty quantification.
- Towards Generalizable and Interpretable Motion Prediction: A Deep Variational Bayes Approach [54.429396802848224] (arXiv, 2024-03-10)
This paper proposes an interpretable generative model for motion prediction with robust generalizability to out-of-distribution cases.
For interpretability, the model achieves target-driven motion prediction by estimating the spatial distribution of long-term destinations.
Experiments on motion prediction datasets validate that the fitted model can be interpretable and generalizable.
- On the Efficient Marginalization of Probabilistic Sequence Models [3.5897534810405403] (arXiv, 2024-03-06)
This dissertation focuses on using autoregressive models to answer complex probabilistic queries.
We develop a class of novel and efficient approximation techniques for marginalization in sequential models that are model-agnostic.
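
To see why such approximations matter, consider exact marginalization in a
toy autoregressive model: computing a marginal p(x_t = v) requires summing
over every possible prefix, so the cost grows exponentially in t. The model
below is a hypothetical random lookup table, not the dissertation's:

    import itertools
    import numpy as np

    V = 3  # toy vocabulary size

    def cond_prob(prefix, v):
        # Hypothetical autoregressive model p(x_t = v | prefix): a
        # deterministic random distribution keyed by the prefix.
        key = 1
        for tok in prefix:
            key = key * (V + 1) + tok + 1  # unique integer per prefix
        return np.random.default_rng(key).dirichlet(np.ones(V))[v]

    def marginal(t, v):
        # Exact p(x_t = v): sums over all V**(t-1) prefixes, i.e. cost
        # exponential in t -- the bottleneck that model-agnostic
        # approximation techniques are designed to sidestep.
        total = 0.0
        for prefix in itertools.product(range(V), repeat=t - 1):
            p_prefix = 1.0
            for i, tok in enumerate(prefix):
                p_prefix *= cond_prob(prefix[:i], tok)
            total += p_prefix * cond_prob(prefix, v)
        return total

    print(sum(marginal(4, v) for v in range(V)))  # sanity check: ~1.0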
- Predictive Churn with the Set of Good Models [64.05949860750235] (arXiv, 2024-02-12)
We study the effect of conflicting predictions over the set of near-optimal machine learning models.
We present theoretical results on the expected churn between models within the Rashomon set.
We show how our approach can be used to better anticipate, reduce, and avoid churn in consumer-facing applications.
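
Predictive churn itself is simple to measure: the fraction of inputs on
which two models disagree. A minimal sketch, where two hypothetical models
differing only in their random seed stand in for members of the Rashomon set:

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier

    X, y = make_classification(n_samples=2000, random_state=0)
    X_tr, y_tr, X_te = X[:1500], y[:1500], X[1500:]

    # Two near-optimal models: same family and data, different seeds.
    m1 = RandomForestClassifier(n_estimators=50, random_state=1).fit(X_tr, y_tr)
    m2 = RandomForestClassifier(n_estimators=50, random_state=2).fit(X_tr, y_tr)

    # Churn: fraction of held-out inputs whose prediction flips.
    churn = np.mean(m1.predict(X_te) != m2.predict(X_te))
    print(f"churn between near-optimal models: {churn:.3f}")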
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228] (arXiv, 2023-10-17)
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
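
The "soft consistency" idea can be sketched generically: instead of
hard-coding the aggregation constraint (a parent series equals the sum of
its children), penalize sampled forecasts for violating it. The squared
penalty below is a toy illustration, not PROFHiT's exact regularizer:

    import numpy as np

    def soft_consistency_penalty(parent_samples, child_samples, weight=1.0):
        # parent_samples: (n_samples,) forecast samples for the parent series
        # child_samples:  (n_children, n_samples) samples for its children
        # Added to the usual likelihood loss, the penalty encourages (but
        # does not force) coherent forecasts across the hierarchy.
        gap = parent_samples - child_samples.sum(axis=0)
        return weight * np.mean(gap ** 2)

    rng = np.random.default_rng(0)
    parent = rng.normal(10.0, 1.0, size=500)
    children = rng.normal(5.0, 1.0, size=(2, 500))
    print(soft_consistency_penalty(parent, children))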
- Distributional Gradient Boosting Machines [77.34726150561087] (arXiv, 2022-04-02)
Our framework is based on XGBoost and LightGBM.
We show that it achieves state-of-the-art forecast accuracy.
- Autoregressive Quantile Flows for Predictive Uncertainty Estimation [7.184701179854522] (arXiv, 2021-12-09)
We propose Autoregressive Quantile Flows, a flexible class of probabilistic models over high-dimensional variables.
These models are instances of autoregressive flows trained using a novel objective based on proper scoring rules.
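
A canonical proper scoring rule for quantile predictions is the pinball
(quantile) loss: minimizing its expectation makes the prediction the
tau-quantile of the target. A minimal version, shown as an illustration
rather than the paper's full training objective:

    import numpy as np

    def pinball_loss(y, q_pred, tau):
        # Pinball loss for quantile level tau in (0, 1): an asymmetric
        # absolute error that is minimized in expectation when q_pred
        # is the tau-quantile of y's distribution.
        diff = y - q_pred
        return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

    y = np.random.default_rng(0).normal(size=10_000)
    # Loss is lower near the true 0.9-quantile of N(0, 1), about 1.2816.
    print(pinball_loss(y, 1.2816, tau=0.9), pinball_loss(y, 0.0, tau=0.9))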
- Learning Interpretable Deep State Space Model for Probabilistic Time Series Forecasting [98.57851612518758] (arXiv, 2021-01-31)
Probabilistic time series forecasting involves estimating the distribution of a series' future given its history.
We propose a deep state space model for probabilistic time series forecasting whereby the non-linear emission model and transition model are parameterized by neural networks.
We show in experiments that our model produces accurate and sharp probabilistic forecasts.
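
Generically, forecasting with such a model means rolling the latent state
forward through the learned transition and sampling observations through the
emission model. The toy networks below (random fixed weights, state dimension
4) are hypothetical stand-ins for the trained ones:

    import numpy as np

    rng = np.random.default_rng(0)
    W_f = 0.5 * rng.normal(size=(4, 4))  # transition weights (toy)
    w_g = 0.5 * rng.normal(size=4)       # emission weights (toy)

    def transition(z):
        # Non-linear latent transition z_t = f(z_{t-1}).
        return np.tanh(W_f @ z)

    def emission(z):
        # Non-linear emission y_t = g(z_t).
        return float(w_g @ z)

    def forecast(z0, horizon, n_samples, proc_std=0.1, obs_std=0.1):
        # Monte Carlo forecast: propagate the state under process noise and
        # sample observations, yielding an empirical predictive distribution
        # at every horizon step.
        out = np.empty((n_samples, horizon))
        for i in range(n_samples):
            z = z0.copy()
            for t in range(horizon):
                z = transition(z) + proc_std * rng.normal(size=z.shape)
                out[i, t] = emission(z) + obs_std * rng.normal()
        return out

    samples = forecast(np.zeros(4), horizon=12, n_samples=1000)
    lo, med, hi = np.quantile(samples, [0.1, 0.5, 0.9], axis=0)  # intervals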
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.