Neural Continuous-Discrete State Space Models for Irregularly-Sampled
Time Series
- URL: http://arxiv.org/abs/2301.11308v3
- Date: Sun, 18 Jun 2023 13:20:29 GMT
- Title: Neural Continuous-Discrete State Space Models for Irregularly-Sampled
Time Series
- Authors: Abdul Fatir Ansari, Alvin Heng, Andre Lim, Harold Soh
- Abstract summary: NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables.
We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference.
Empirical results on multiple benchmark datasets show improved imputation and forecasting performance of NCDSSM over existing models.
- Score: 18.885471782270375
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning accurate predictive models of real-world dynamic phenomena
(e.g., climate, biological systems) remains a challenging task. One key issue is
that the data
generated by both natural and artificial processes often comprise time series
that are irregularly sampled and/or contain missing observations. In this work,
we propose the Neural Continuous-Discrete State Space Model (NCDSSM) for
continuous-time modeling of time series through discrete-time observations.
NCDSSM employs auxiliary variables to disentangle recognition from dynamics,
thus requiring amortized inference only for the auxiliary variables. Leveraging
techniques from continuous-discrete filtering theory, we demonstrate how to
perform accurate Bayesian inference for the dynamic states. We propose three
flexible parameterizations of the latent dynamics and an efficient training
objective that marginalizes the dynamic states during inference. Empirical
results on multiple benchmark datasets across various domains show improved
imputation and forecasting performance of NCDSSM over existing models.
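To make the filtering idea concrete, below is a minimal sketch of a classical continuous-discrete Kalman filter, the building block that the abstract's "continuous-discrete filtering theory" refers to. It assumes linear time-invariant latent dynamics and linear-Gaussian observations; NCDSSM itself proposes richer dynamics parameterizations and amortizes inference only over auxiliary variables, so all names and settings below are illustrative, not the paper's actual code.

```python
import numpy as np

def cd_kalman_filter(ts, ys, A, H, Q, R, m0, P0, dt=0.01):
    """Continuous-discrete Kalman filter: latent dynamics dz = A z dt + noise
    (diffusion cov Q), observed as y_k = H z(t_k) + noise (cov R) at the
    irregular timestamps in ts."""
    m, P = m0.copy(), P0.copy()
    t, means = 0.0, []
    for tk, yk in zip(ts, ys):
        # Predict: Euler-integrate the moment ODEs dm/dt = A m and
        # dP/dt = A P + P A^T + Q between observation times.
        while t < tk:
            h = min(dt, tk - t)
            m = m + h * (A @ m)
            P = P + h * (A @ P + P @ A.T + Q)
            t += h
        # Update: standard discrete-time Kalman correction at t_k.
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ (yk - H @ m)
        P = P - K @ H @ P
        means.append(m.copy())
    return np.array(means)

# Toy usage: a damped 2-D oscillator observed at irregular times.
rng = np.random.default_rng(0)
A = np.array([[0.0, 1.0], [-1.0, -0.1]])
H = np.array([[1.0, 0.0]])
Q = 0.01 * np.eye(2)
R = np.array([[0.1]])
ts = np.sort(rng.uniform(0, 10, size=20))   # irregular timestamps
ys = [np.array([np.sin(t)]) for t in ts]    # stand-in observations
filtered = cd_kalman_filter(ts, ys, A, H, Q, R, m0=np.zeros(2), P0=np.eye(2))
```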
Related papers
- Neural Persistence Dynamics [8.197801260302642]
We consider the problem of learning the dynamics in the topology of time-evolving point clouds.
Our proposed model -- Neural Persistence Dynamics -- substantially outperforms the state-of-the-art across a diverse set of parameter regression tasks.
arXiv Detail & Related papers (2024-05-24T17:20:18Z)
- A Poisson-Gamma Dynamic Factor Model with Time-Varying Transition Dynamics [51.147876395589925]
A non-stationary PGDS is proposed to allow the underlying transition matrices to evolve over time.
A fully-conjugate and efficient Gibbs sampler is developed to perform posterior simulation.
Experiments show that, in comparison with related models, the proposed non-stationary PGDS achieves improved predictive performance.
arXiv Detail & Related papers (2024-02-26T04:39:01Z)
- Synthetic location trajectory generation using categorical diffusion models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to become one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z)
- Attractor Memory for Long-Term Time Series Forecasting: A Chaos Perspective [63.60312929416228]
Attraos incorporates chaos theory into long-term time series forecasting.
We show that Attraos outperforms various LTSF methods on mainstream datasets and chaotic datasets with only one-twelfth of the parameters compared to PatchTST.
arXiv Detail & Related papers (2024-02-18T05:35:01Z)
- Anamnesic Neural Differential Equations with Orthogonal Polynomial Projections [6.345523830122166]
We propose PolyODE, a formulation that enforces long-range memory and preserves a global representation of the underlying dynamical system.
Our construction is backed by favourable theoretical guarantees and we demonstrate that it outperforms previous works in the reconstruction of past and future data.
arXiv Detail & Related papers (2023-03-03T10:49:09Z)
- Learning continuous models for continuous physics [94.42705784823997]
We develop a test based on numerical analysis theory to validate machine learning models for science and engineering applications.
Our results illustrate how principled numerical analysis methods can be coupled with existing ML training/testing methodologies to validate such models.
arXiv Detail & Related papers (2022-02-17T07:56:46Z)
- Likelihood-Free Inference in State-Space Models with Unknown Dynamics [71.94716503075645]
We introduce a method for inferring and predicting latent states in state-space models where observations can only be simulated, and transition dynamics are unknown.
We propose a way of doing likelihood-free inference (LFI) of states and state prediction with a limited number of simulations.
arXiv Detail & Related papers (2021-11-02T12:33:42Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series (see the smoothness-term sketch after this list).
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Deep Neural Dynamic Bayesian Networks applied to EEG sleep spindles modeling [0.0]
We propose a generative model for single-channel EEG that incorporates the constraints experts actively enforce during visual scoring.
We derive algorithms for exact, tractable inference as a special case of Generalized Expectation Maximization.
We validate the model on three public datasets and provide evidence that more complex models can surpass state-of-the-art detectors.
arXiv Detail & Related papers (2020-10-16T21:48:29Z)
- Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series with neural stochastic differential equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series (a minimal neural SDE simulation sketch follows this list).
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
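As referenced in the last entry above, a neural SDE can be simulated on an irregular (sporadic) time grid with the Euler-Maruyama scheme. The sketch below is a minimal illustration under that assumption; the random-weight "networks" merely stand in for VSDN's learned drift and diffusion, and none of these names come from the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative stand-ins for learned networks: a tiny random-weight MLP
# as the drift f(z) and a constant diagonal scale as the diffusion g(z).
# (Assumption: the real model learns these via variational training.)
W1, W2 = rng.normal(size=(16, 2)), rng.normal(size=(2, 16)) / 16.0

def drift(z):
    return W2 @ np.tanh(W1 @ z)

def diffusion(z):
    return 0.1 * np.ones_like(z)

def euler_maruyama(z0, t_grid):
    """Simulate dz = drift(z) dt + diffusion(z) dW on an irregular grid."""
    z, path = z0.copy(), [z0.copy()]
    for t0, t1 in zip(t_grid[:-1], t_grid[1:]):
        h = t1 - t0
        dW = rng.normal(scale=np.sqrt(h), size=z.shape)  # Brownian increment
        z = z + h * drift(z) + diffusion(z) * dW
        path.append(z.copy())
    return np.array(path)

t_grid = np.sort(rng.uniform(0.0, 5.0, size=30))  # sporadic time points
path = euler_maruyama(np.zeros(2), t_grid)
```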
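For the SISVAE entry above, one common way to induce smoothness over per-timestamp Gaussians is to penalize the KL divergence between consecutive distributions. The sketch below is an assumed, illustrative instantiation of such a penalty; the paper's exact objective may differ.

```python
import numpy as np

def kl_diag_gaussians(mu0, var0, mu1, var1):
    """KL( N(mu0, diag var0) || N(mu1, diag var1) ), summed over dimensions."""
    return 0.5 * np.sum(np.log(var1 / var0)
                        + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def smoothness_penalty(mus, variances):
    """Sum of KLs between consecutive per-timestamp Gaussians, encouraging
    the predicted distribution to vary smoothly over time."""
    return sum(kl_diag_gaussians(mus[t], variances[t],
                                 mus[t - 1], variances[t - 1])
               for t in range(1, len(mus)))

# Toy usage: network outputs for T=5 timestamps, latent dim 3 (illustrative).
rng = np.random.default_rng(0)
mus = rng.normal(size=(5, 3))
variances = np.exp(rng.normal(size=(5, 3)))
penalty = smoothness_penalty(mus, variances)  # add to the VAE loss, weighted
```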
This list is automatically generated from the titles and abstracts of the papers on this site.