Partially Hidden Markov Chain Linear Autoregressive model: inference and forecasting
- URL: http://arxiv.org/abs/2102.12584v1
- Date: Wed, 24 Feb 2021 22:12:05 GMT
- Title: Partially Hidden Markov Chain Linear Autoregressive model: inference and forecasting
- Authors: Fatoumata Dama and Christine Sinoquet
- Abstract summary: Time series subject to changes in regime have attracted much interest in domains such as econometrics, finance and meteorology.
We present a novel model that addresses the intermediate case: (i) the state processes associated with such time series are modelled by Partially Hidden Markov Chains (PHMCs); (ii) a linear autoregressive (LAR) model drives the dynamics of the time series within each regime.
We propose a hidden state inference procedure and a forecasting function that take into account the observed states when they exist.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series subject to changes in regime have attracted much interest in domains such as econometrics, finance and meteorology. For discrete-valued regimes, models such as the popular Hidden Markov Chain (HMC) describe time series whose state process is unknown at all time-steps. Sometimes, time series are first labelled by means of an annotation function, and another category of models handles the case in which regimes are observed at all time-steps. We present a novel model which addresses the intermediate case: (i) the state processes associated with such time series are modelled by Partially Hidden Markov Chains (PHMCs); (ii) a linear autoregressive (LAR) model drives the dynamics of the time series within each regime. We describe a variant of the expectation-maximization (EM) algorithm devoted to PHMC-LAR model learning. We propose a hidden state inference procedure and a forecasting function that take into account the observed states when they exist. We assess inference and prediction performance, and analyze EM convergence times for the new model, using simulated data. We show the benefits of using partially observed states to decrease EM convergence times. A fully labelled scheme with unreliable labels also speeds up EM, which offers promising prospects for enhancing PHMC-LAR model selection. We also point out the robustness of PHMC-LAR to labelling errors in the inference task when large training datasets and moderate labelling error rates are considered. Finally, we highlight its remarkable robustness to labelling errors in the prediction task, over the whole range of error rates.
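To make the model class concrete, below is a minimal sketch of a PHMC-LAR-style process: a Markov chain over regimes whose states are only revealed at some time-steps, an AR(1) model within each regime, forward filtering that clamps the regime distribution to a label whenever one is observed, and a simple plug-in one-step forecast. All parameter values, function names and the plug-in forecast are illustrative assumptions; this is not the authors' EM learning procedure or exact forecasting function.

```python
# Illustrative PHMC-LAR-style sketch (assumptions, not the paper's algorithm).
import numpy as np

rng = np.random.default_rng(0)

# --- Toy ingredients: 2 regimes, AR order p = 1 ---
A = np.array([[0.95, 0.05],           # regime transition matrix
              [0.10, 0.90]])
phi0 = np.array([0.0, 2.0])           # per-regime intercepts
phi1 = np.array([0.7, -0.3])          # per-regime AR(1) coefficients
sigma = np.array([0.5, 1.0])          # per-regime noise std

def simulate(T, label_rate=0.3):
    """Simulate the series; each regime label is revealed with prob. `label_rate`
    (this partial labelling is what the PHMC component captures)."""
    s = np.zeros(T, dtype=int)
    x = np.zeros(T)
    observed = np.full(T, -1)          # -1 = hidden state, otherwise the label
    for t in range(1, T):
        s[t] = rng.choice(2, p=A[s[t - 1]])
        x[t] = phi0[s[t]] + phi1[s[t]] * x[t - 1] + sigma[s[t]] * rng.normal()
        if rng.random() < label_rate:
            observed[t] = s[t]
    return x, s, observed

def filter_states(x, observed):
    """Forward filtering over regimes; observed labels clamp the distribution."""
    alpha = np.full(2, 0.5)
    for t in range(1, len(x)):
        pred = alpha @ A                                    # predict next regime
        lik = (np.exp(-0.5 * ((x[t] - phi0 - phi1 * x[t - 1]) / sigma) ** 2)
               / (sigma * np.sqrt(2 * np.pi)))              # per-regime likelihood
        alpha = pred * lik
        if observed[t] >= 0:                                # use the label when it exists
            mask = np.zeros(2)
            mask[observed[t]] = 1.0
            alpha *= mask
        alpha /= alpha.sum()
    return alpha

def forecast_one_step(x, observed):
    """Plug-in one-step forecast: mixture of per-regime AR predictions,
    weighted by the predicted regime distribution."""
    w = filter_states(x, observed) @ A
    return float(np.sum(w * (phi0 + phi1 * x[-1])))

x, s, obs = simulate(300)
print("one-step forecast:", forecast_one_step(x, obs))
```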
Related papers
- Amortized Control of Continuous State Space Feynman-Kac Model for Irregular Time Series [14.400596021890863]
Real-world data in domains such as healthcare, climate, and economics are often collected as irregular time series.
We propose the Amortized Control of continuous State Space Model (ACSSM) for continuous dynamical modeling of time series.
arXiv Detail & Related papers (2024-10-08T01:27:46Z) - Graph Spatiotemporal Process for Multivariate Time Series Anomaly
Detection with Missing Values [67.76168547245237]
We introduce a novel framework called GST-Pro, which utilizes a graph spatiotemporal process and an anomaly scorer to detect anomalies.
Our experimental results show that the GST-Pro method can effectively detect anomalies in time series data and outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-01-11T10:10:16Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce the powerful model class of "Denoising Diffusion Probabilistic Models" (DDPMs) for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive and learns to capture holistic concepts, and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Learning to Reconstruct Missing Data from Spatiotemporal Graphs with
Sparse Observations [11.486068333583216]
This paper tackles the problem of learning effective models to reconstruct missing data points.
We propose a class of attention-based architectures that, given a set of highly sparse observations, learn a representation for points in time and space.
Compared to the state of the art, our model handles sparse data without propagating prediction errors or requiring a bidirectional model to encode forward and backward time dependencies.
arXiv Detail & Related papers (2022-05-26T16:40:48Z) - Markov Chain Monte Carlo for Continuous-Time Switching Dynamical Systems [26.744964200606784]
We propose a novel inference algorithm utilizing a Markov Chain Monte Carlo approach.
The presented Gibbs sampler efficiently obtains samples from the exact continuous-time posterior processes.
arXiv Detail & Related papers (2022-05-18T09:03:00Z) - Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential
Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Time Adaptive Gaussian Model [0.913755431537592]
Our model is a generalization of state-of-the-art methods for the inference of temporal graphical models.
It performs pattern recognition by clustering data points in time, and it finds probabilistic (and possibly causal) relationships among the observed variables.
arXiv Detail & Related papers (2021-02-02T00:28:14Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Comparative Analysis of the Hidden Markov Model and LSTM: A Simulative
Approach [0.0]
We show that a hidden Markov model can still be an effective method for processing sequence data even when the first-order Markov assumption is not satisfied.
Our results indicate that even an unsupervised hidden Markov model can outperform an LSTM when a massive amount of labeled data is not available.
arXiv Detail & Related papers (2020-08-09T22:13:10Z) - Targeted stochastic gradient Markov chain Monte Carlo for hidden Markov models with rare latent states [48.705095800341944]
Markov chain Monte Carlo (MCMC) algorithms for hidden Markov models often rely on the forward-backward sampler.
This makes them computationally slow as the length of the time series increases, motivating the development of sub-sampling-based approaches.
We propose a targeted sub-sampling approach that over-samples observations corresponding to rare latent states when calculating the gradient of parameters associated with them.
arXiv Detail & Related papers (2018-10-31T17:44:20Z)
This list is automatically generated from the titles and abstracts of the papers on this site.