Warped Dynamic Linear Models for Time Series of Counts
- URL: http://arxiv.org/abs/2110.14790v4
- Date: Tue, 6 Jun 2023 18:14:36 GMT
- Title: Warped Dynamic Linear Models for Time Series of Counts
- Authors: Brian King and Daniel R. Kowal
- Abstract summary: We introduce a novel semiparametric methodology for count time series by warping a Gaussian DLM.
We leverage these results to produce customized and efficient algorithms for inference and forecasting.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Dynamic Linear Models (DLMs) are commonly employed for time series analysis
due to their versatile structure, simple recursive updating, ability to handle
missing data, and probabilistic forecasting. However, the options for count
time series are limited: Gaussian DLMs require continuous data, while
Poisson-based alternatives often lack sufficient modeling flexibility. We
introduce a novel semiparametric methodology for count time series by warping a
Gaussian DLM. The warping function has two components: a (nonparametric)
transformation operator that provides distributional flexibility and a rounding
operator that ensures the correct support for the discrete data-generating
process. We develop conjugate inference for the warped DLM, which enables
analytic and recursive updates for the state space filtering and smoothing
distributions. We leverage these results to produce customized and efficient
algorithms for inference and forecasting, including Monte Carlo simulation for
offline analysis and an optimal particle filter for online inference. This
framework unifies and extends a variety of discrete time series models and is
valid for natural counts, rounded values, and multivariate observations.
Simulation studies illustrate the excellent forecasting capabilities of the
warped DLM. The proposed approach is applied to a multivariate time series of
daily overdose counts and demonstrates both modeling and computational
successes.
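The two-part warping described in the abstract can be illustrated with a minimal simulation sketch. This assumes a local-level Gaussian DLM and uses `exp` as a stand-in monotone transformation; the paper's nonparametric transformation and its conjugate inference algorithms are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Local-level Gaussian DLM: theta_t = theta_{t-1} + w_t,  z_t = theta_t + v_t
T = 100
state_sd, obs_sd = 0.1, 0.5
theta = np.cumsum(rng.normal(0.0, state_sd, T))  # latent state (random walk)
z = theta + rng.normal(0.0, obs_sd, T)           # latent Gaussian observations

def warp_to_counts(z, g_inv=np.exp):
    """Warping = monotone transformation (here exp, a hypothetical stand-in
    for the paper's nonparametric transformation) followed by a rounding
    operator (floor) that enforces integer support for the counts."""
    return np.floor(g_inv(z)).astype(int)

y = warp_to_counts(z)  # nonnegative integer-valued count time series
```

Because `exp` maps to the positive reals and `floor` then maps to the integers, the simulated series has the correct support for natural counts while inheriting the DLM's temporal dependence.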
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
arXiv Detail & Related papers (2023-10-04T07:14:43Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time-windows.
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
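The inverse-free idea above — replacing an explicit matrix inversion with an iterative linear solver — can be sketched with a plain conjugate-gradient solve. This is a standard illustration, not the paper's method: the Monte Carlo sampling and unrolled backpropagation machinery are omitted.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    """Solve A x = b for symmetric positive-definite A without forming A^{-1}."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Hypothetical SPD system, e.g. a covariance solve in a latent Gaussian model
rng = np.random.default_rng(1)
M = rng.normal(size=(5, 5))
A = M @ M.T + 5.0 * np.eye(5)   # symmetric positive definite
b = rng.normal(size=5)
x = conjugate_gradient(A, b)
```

Each iteration costs only a matrix-vector product, which is why such solvers scale better than explicit inversion for large latent Gaussian models.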
arXiv Detail & Related papers (2023-06-05T21:08:34Z)
- DynaConF: Dynamic Forecasting of Non-Stationary Time Series [4.286546152336783]
We propose a new method to model non-stationary conditional distributions over time.
We show that our model can adapt to non-stationary time series better than state-of-the-art deep learning solutions.
arXiv Detail & Related papers (2022-09-17T21:40:02Z)
- Wasserstein multivariate auto-regressive models for modeling distributional time series [0.0]
We propose a new auto-regressive model for the statistical analysis of multivariate distributional time series.
Results on the existence, uniqueness and stationarity of the solution of such a model are provided.
To shed some light on the benefits of our approach for real data analysis, we also apply this methodology to a data set made of observations from age distribution in different countries.
arXiv Detail & Related papers (2022-07-12T10:18:36Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
Estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting [4.131842516813833]
We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as 50% on several standard metrics.
arXiv Detail & Related papers (2021-01-25T22:29:40Z)
- Multivariate Probabilistic Time Series Forecasting via Conditioned Normalizing Flows [8.859284959951204]
Time series forecasting is fundamental to scientific and engineering problems.
Deep learning methods are well suited for this problem.
We show that the proposed conditioned normalizing flow model improves over the state of the art on standard metrics across many real-world data sets.
arXiv Detail & Related papers (2020-02-14T16:16:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.