Negative-Binomial Randomized Gamma Markov Processes for Heterogeneous
Overdispersed Count Time Series
- URL: http://arxiv.org/abs/2402.18995v1
- Date: Thu, 29 Feb 2024 09:46:47 GMT
- Title: Negative-Binomial Randomized Gamma Markov Processes for Heterogeneous
Overdispersed Count Time Series
- Authors: Rui Huang, Sikun Yang, Heinz Koeppl
- Abstract summary: We propose a negative-binomial-randomized gamma Markov process, which improves the predictive performance of the dynamical system.
We also develop methods to estimate both factor-structured and graph-structured transition dynamics, which enable us to infer more explainable latent structure.
- Score: 34.31866715010829
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Modeling count-valued time series has been receiving increasing attention
since count time series naturally arise in physical and social domains. Poisson
gamma dynamical systems (PGDSs) are newly developed methods that can effectively
capture the expressive latent transition structure and bursty dynamics behind
count sequences. In particular, PGDSs demonstrate superior performance in terms
of data imputation and prediction, compared with canonical linear dynamical
system (LDS) based methods. Despite these advantages, PGDSs cannot capture the
heterogeneous overdispersed behaviours of the underlying dynamic processes. To
mitigate this defect, we propose a negative-binomial-randomized gamma Markov
process, which not only significantly improves the predictive performance of
the proposed dynamical system, but also facilitates the fast convergence of the
inference algorithm. Moreover, we develop methods to estimate both
factor-structured and graph-structured transition dynamics, which enable us to
infer more explainable latent structure, compared with PGDSs. Finally, we
demonstrate the explainable latent structure learned by the proposed method,
and show its superior performance in imputing missing data and forecasting
future observations, compared with the related models.
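The generative mechanism described in the abstract can be sketched as a simple simulation. The following is a minimal illustrative sketch, not the paper's exact construction: a gamma Markov chain whose shape parameter is randomized through a negative-binomial draw, with Poisson-distributed counts observed on top. All hyperparameter names and values here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative hyperparameters (assumed, not taken from the paper).
K, T = 3, 100      # latent dimension, sequence length
tau = 1.0          # gamma rate parameter
p = 0.5            # negative-binomial probability parameter
delta = 5.0        # Poisson observation scale
eps = 1e-3         # small constant keeping shape parameters positive
Pi = rng.dirichlet(np.ones(K), size=K).T   # column-normalized transition matrix

theta = np.empty((T, K))
y = np.empty((T, K), dtype=np.int64)

theta[0] = rng.gamma(1.0, 1.0, size=K)
y[0] = rng.poisson(delta * theta[0])

for t in range(1, T):
    shape = tau * (Pi @ theta[t - 1])
    # Randomize the gamma shape with a negative-binomial count: the extra
    # NB dispersion is what lets the chain express heterogeneous
    # overdispersion that a deterministic gamma shape cannot.
    h = rng.negative_binomial(shape + eps, p)
    theta[t] = rng.gamma(h + eps, 1.0 / tau)
    y[t] = rng.poisson(delta * theta[t])
```

Sampling the shape through a count distribution before drawing the gamma state is what injects the extra, state-dependent variance into the latent trajectory.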
Related papers
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Causal Graph Discovery from Self and Mutually Exciting Time Series [10.410454851418548]
We develop a non-asymptotic recovery guarantee and quantifiable uncertainty by solving a linear program.
We demonstrate the effectiveness of our approach in recovering highly interpretable causal DAGs over Sepsis Associated Derangements (SADs)
arXiv Detail & Related papers (2023-01-26T16:15:27Z)
- Deep Explicit Duration Switching Models for Time Series [84.33678003781908]
We propose a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
State-dependent switching is enabled by a recurrent state-to-switch connection.
An explicit duration count variable is used to improve the time-dependent switching behavior.
arXiv Detail & Related papers (2021-10-26T17:35:21Z)
- Deep Probabilistic Time Series Forecasting using Augmented Recurrent Input for Dynamic Systems [12.319812075685956]
We combine advances in deep generative models and state space models (SSMs) to develop a novel, data-driven deep probabilistic sequence model.
Specifically, we follow the popular encoder-decoder generative structure to build a recurrent neural network (RNN) assisted variational sequence model.
To alleviate the inconsistency between training and prediction, we propose using a hybrid output as the input at the next time step, which brings training and prediction into alignment.
arXiv Detail & Related papers (2021-06-03T23:41:11Z)
- Dynamic Gaussian Mixture based Deep Generative Model For Robust Forecasting on Sparse Multivariate Time Series [43.86737761236125]
We propose a novel generative model, which tracks the transition of latent clusters, instead of isolated feature representations.
It is characterized by a newly designed dynamic Gaussian mixture distribution, which captures the dynamics of clustering structures.
A structured inference network is also designed for enabling inductive analysis.
arXiv Detail & Related papers (2021-03-03T04:10:07Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
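As a rough illustration of the forced-linear-system idea, one can fit x_{t+1} ≈ A x_t + B u_t to data by least squares, in the style of dynamic mode decomposition with control. The synthetic "load" signal, the forcing pattern, and all names below are stand-ins for illustration, not the paper's data or exact method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic near-periodic signal plus a known forcing input
# (illustrative stand-ins for the paper's grid-load data).
T = 200
t = np.arange(T)
u = (t % 24 < 12).astype(float)[None, :]        # daily on/off forcing
x = np.vstack([np.sin(2 * np.pi * t / 24),
               np.cos(2 * np.pi * t / 24)])
x = x + 0.2 * u + 0.01 * rng.standard_normal(x.shape)

# Fit x_{t+1} ~= A x_t + B u_t by least squares (DMD-with-control style).
X, Xp, U = x[:, :-1], x[:, 1:], u[:, :-1]
G = Xp @ np.linalg.pinv(np.vstack([X, U]))      # concatenated [A | B]
A, B = G[:, :2], G[:, 2:]

def forecast(x0, u_future):
    """Roll the fitted linear model forward under a given forcing sequence."""
    out, xc = [], x0
    for uf in u_future:
        xc = A @ xc + B[:, 0] * uf
        out.append(xc)
    return np.array(out)

pred = forecast(x[:, -1], u[0, -24:])   # forecast 24 steps ahead
```

The interpretability claim follows from the fit itself: A exposes the intrinsic linear dynamics and B the response to the forcing, so both can be inspected directly.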
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
- Multiplicative noise and heavy tails in stochastic optimization [62.993432503309485]
Stochastic optimization is central to modern machine learning, but the precise role of the stochasticity in its success remains unclear.
Modeling stochastic optimization algorithms as discrete random recurrence relations, we show that multiplicative noise, as it commonly arises due to variance in minibatches, results in heavy-tailed behaviour in the parameters.
A detailed analysis is conducted describing how key factors, including step size and data, shape this behaviour, with similar results exhibited by state-of-the-art neural network models.
arXiv Detail & Related papers (2020-06-11T09:58:01Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)
- Supervised Learning for Non-Sequential Data: A Canonical Polyadic Decomposition Approach [85.12934750565971]
Efficient modelling of feature interactions underpins supervised learning for non-sequential tasks, but capturing all interactions makes the number of model parameters grow exponentially with the feature dimension.
To alleviate this issue, it has been proposed to implicitly represent the model parameters as a tensor.
For enhanced expressiveness, we generalize the framework to allow feature mapping to arbitrarily high-dimensional feature vectors.
arXiv Detail & Related papers (2020-01-27T22:38:40Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.