Deep Latent State Space Models for Time-Series Generation
- URL: http://arxiv.org/abs/2212.12749v1
- Date: Sat, 24 Dec 2022 15:17:42 GMT
- Title: Deep Latent State Space Models for Time-Series Generation
- Authors: Linqi Zhou, Michael Poli, Winnie Xu, Stefano Massaroli, Stefano Ermon
- Abstract summary: We propose LS4, a generative model for sequences with latent variables evolving according to a state space ODE.
Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4.
We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets.
- Score: 68.45746489575032
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Methods based on ordinary differential equations (ODEs) are widely used to build generative models of time-series. In addition to the high computational overhead of explicitly computing the hidden-state recurrence, existing ODE-based models fall short in learning sequence data with sharp transitions, common in many real-world systems, due to numerical challenges during optimization. In this work, we propose LS4, a generative model for sequences with latent variables evolving according to a state space ODE to increase modeling capacity. Inspired by recent deep state space models (S4), we achieve speedups by leveraging a convolutional representation of LS4 which bypasses the explicit evaluation of hidden states. We show that LS4 significantly outperforms previous continuous-time generative models in terms of marginal distribution, classification, and prediction scores on real-world datasets in the Monash Forecasting Repository, and is capable of modeling highly stochastic data with sharp temporal transitions. LS4 sets a new state of the art for continuous-time latent generative models, with significant improvements in mean squared error and tighter variational lower bounds on irregularly-sampled datasets, while also being 100x faster than other baselines on long sequences.
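The claimed speedup comes from the convolutional view of a linear state space model, the same trick S4 uses: a discretized SSM x_{k+1} = A_bar x_k + B_bar u_k, y_k = C x_k unrolls into a convolution y = K * u with kernel K[k] = C A_bar^k B_bar, so hidden states never need to be materialized. Below is a minimal sketch of that view; all names and shapes are illustrative assumptions, and S4/LS4 compute the kernel in closed form from a structured state matrix rather than by the explicit power iteration used here.

```python
# Hedged sketch of the convolutional view of a linear SSM, the trick LS4
# builds on. Names, shapes, and the explicit kernel loop are illustrative,
# not the authors' implementation.
import torch

def ssm_conv_kernel(A_bar, B_bar, C, L):
    """Kernel K[k] = C @ A_bar^k @ B_bar for k = 0..L-1 (single channel)."""
    K = torch.empty(L)
    x = B_bar.clone()                 # (N,)
    for k in range(L):
        K[k] = C @ x                  # scalar readout
        x = A_bar @ x                 # advance one step; no state is stored
    return K

def ssm_apply(K, u):
    """Causal convolution y = K * u via FFT: O(L log L), no recurrence."""
    L = u.shape[-1]
    n = 2 * L                         # zero-pad to avoid circular wrap-around
    y = torch.fft.irfft(torch.fft.rfft(K, n) * torch.fft.rfft(u, n), n)
    return y[..., :L]

# Tiny demo: a roughly stable random SSM driven by a step input.
torch.manual_seed(0)
N, L = 4, 256
A_bar = 0.9 * torch.eye(N) + 0.05 * torch.randn(N, N)
B_bar, C = torch.randn(N), torch.randn(N)
y = ssm_apply(ssm_conv_kernel(A_bar, B_bar, C, L), torch.ones(L))
print(y.shape)  # torch.Size([256])
```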
Related papers
- Functional Latent Dynamics for Irregularly Sampled Time Series Forecasting [5.359176539960004]
Irregularly sampled time series with missing values are often observed in multiple real-world applications such as healthcare, climate and astronomy.
We propose a family of models called Functional Latent Dynamics (FLD).
Instead of solving an ordinary differential equation (ODE), we use simple curves which exist at all time points to specify the continuous latent state in the model.
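A minimal sketch of this idea follows, assuming a tiny polynomial-plus-sine basis; the basis choice and all names are illustrative, not FLD's actual parameterization. Because the latent state is a closed-form function of time, it can be evaluated at arbitrary, irregular timestamps without stepping a solver.

```python
# Hedged sketch: the continuous latent state is a learned combination of
# simple curves defined at all time points, so no ODE solver is needed.
# Basis and names are assumptions for illustration only.
import torch

def latent_curve(t, coeffs):
    """z(t) = basis(t) @ coeffs.T with a tiny polynomial/sine basis."""
    basis = torch.stack([torch.ones_like(t), t, t**2, torch.sin(t)], dim=-1)  # (T, 4)
    return basis @ coeffs.T                                                   # (T, latent_dim)

coeffs = torch.randn(8, 4, requires_grad=True)      # latent_dim=8, 4 basis functions
t_irregular = torch.tensor([0.0, 0.13, 0.9, 2.4])   # arbitrary observation times
z = latent_curve(t_irregular, coeffs)               # (4, 8), evaluated directly
print(z.shape)
```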
arXiv Detail & Related papers (2024-05-06T15:53:55Z) - Convolutional State Space Models for Long-Range Spatiotemporal Modeling [65.0993000439043]
ConvS5 is an efficient convolutional state space variant for long-range spatiotemporal modeling.
It significantly outperforms Transformers and ConvLSTM on a long-horizon Moving-MNIST experiment, while training 3x faster than ConvLSTM and generating samples 400x faster than Transformers.
arXiv Detail & Related papers (2023-10-30T16:11:06Z) - SeqLink: A Robust Neural-ODE Architecture for Modelling Partially Observed Time Series [11.261457967759688]
We introduce SeqLink, an innovative neural architecture designed to enhance the robustness of sequence representations.
We demonstrate that SeqLink improves the modelling of intermittent time series, consistently outperforming state-of-the-art approaches.
arXiv Detail & Related papers (2022-12-07T10:25:59Z) - Liquid Structural State-Space Models [106.74783377913433]
Liquid-S4 achieves an average performance of 87.32% on the Long-Range Arena benchmark.
On the full raw Speech Command recognition dataset, Liquid-S4 achieves 96.78% accuracy with a 30% reduction in parameter count compared to S4.
arXiv Detail & Related papers (2022-09-26T18:37:13Z) - Time-series Transformer Generative Adversarial Networks [5.254093731341154]
We consider the limitations specific to time-series data and present a model that can generate synthetic time-series.
A model that generates synthetic time-series data has two objectives: 1) to capture the stepwise conditional distribution of real sequences, and 2) to faithfully model the joint distribution of entire real sequences.
We present TsT-GAN, a framework that capitalises on the Transformer architecture to satisfy the desiderata and compare its performance against five state-of-the-art models on five datasets.
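The two objectives above translate naturally into two losses, sketched below under loose assumptions: a teacher-forced one-step-ahead loss for the stepwise conditional distribution, and a sequence-level adversarial loss for the joint distribution. A GRU stands in for TsT-GAN's Transformer generator, and every module here is a hypothetical stand-in, not the paper's architecture.

```python
# Hedged sketch of the two objectives: (1) stepwise conditional fit via
# next-step prediction, (2) joint-distribution fit via a whole-sequence
# critic. Modules are illustrative stand-ins (GRU instead of a Transformer).
import torch
import torch.nn as nn

T, D = 24, 3
gen = nn.GRU(D, D, batch_first=True)                      # stand-in generator
disc = nn.Sequential(nn.Flatten(), nn.Linear(T * D, 1))   # whole-sequence critic
bce = nn.BCEWithLogitsLoss()

real = torch.randn(16, T, D)

# (1) stepwise: teacher-forced one-step-ahead prediction on real sequences
pred, _ = gen(real[:, :-1])
loss_step = ((pred - real[:, 1:]) ** 2).mean()

# (2) joint: generated sequences should fool the sequence-level critic
fake, _ = gen(torch.randn(16, T, D))
loss_joint = bce(disc(fake), torch.ones(16, 1))           # generator side of GAN loss

(loss_step + loss_joint).backward()
```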
arXiv Detail & Related papers (2022-05-23T10:04:21Z) - Deep Generative Model with Hierarchical Latent Factors for Time Series Anomaly Detection [40.21502451136054]
This work presents DGHL, a new family of generative models for time series anomaly detection.
A top-down Convolutional Network maps a novel hierarchical latent space to time series windows, exploiting temporal dynamics to encode information efficiently.
Our method outperformed current state-of-the-art models on four popular benchmark datasets.
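A hedged sketch of the top-down decoding idea: a coarse latent sets the overall shape of the window and a finer-scale latent is injected partway down, with transposed convolutions doing the upsampling. Layer sizes, latent shapes, and names are assumptions for illustration, not DGHL's network.

```python
# Hedged sketch of a top-down convolutional decoder over a latent
# hierarchy (coarse -> fine). All sizes and names are illustrative.
import torch
import torch.nn as nn

class TopDownDecoder(nn.Module):
    def __init__(self, channels=32):
        super().__init__()
        self.up1 = nn.ConvTranspose1d(8, channels, 4, stride=4)             # length 1 -> 4
        self.up2 = nn.ConvTranspose1d(channels + 8, channels, 4, stride=4)  # 4 -> 16
        self.out = nn.Conv1d(channels, 1, 3, padding=1)

    def forward(self, z_coarse, z_fine):
        h = torch.relu(self.up1(z_coarse))   # (B, C, 4): coarse latent sets the shape
        h = torch.cat([h, z_fine], dim=1)    # inject the finer-scale latent
        h = torch.relu(self.up2(h))          # (B, C, 16)
        return self.out(h)                   # reconstructed window (B, 1, 16)

dec = TopDownDecoder()
x_hat = dec(torch.randn(2, 8, 1), torch.randn(2, 8, 4))
print(x_hat.shape)  # torch.Size([2, 1, 16]); anomalies score high reconstruction error
```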
arXiv Detail & Related papers (2022-02-15T17:19:44Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers, which makes them costly to train and deploy.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
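A hedged sketch of the closed-form flavor: the state at an arbitrary elapsed time t is produced by a learned, time-dependent gate blending two network heads, so no numerical solver is ever stepped. The published CfC gating differs in its details; everything below is an assumption meant only to convey the idea.

```python
# Hedged sketch of a closed-form continuous-depth-style cell: the state at
# time t comes from a gate that is an explicit function of t, not from an
# ODE solver. Not the paper's exact cell.
import torch
import torch.nn as nn

class CfCLikeCell(nn.Module):
    def __init__(self, in_dim, hidden):
        super().__init__()
        self.f = nn.Linear(in_dim + hidden, hidden)   # controls the time constant
        self.g = nn.Linear(in_dim + hidden, hidden)   # "early" target state
        self.h = nn.Linear(in_dim + hidden, hidden)   # "late" target state

    def forward(self, x, state, t):
        z = torch.cat([x, state], dim=-1)
        gate = torch.sigmoid(-self.f(z) * t)          # explicit in t: no solver steps
        return gate * torch.tanh(self.g(z)) + (1 - gate) * torch.tanh(self.h(z))

cell = CfCLikeCell(3, 16)
state = torch.zeros(1, 16)
for dt, x in [(0.1, torch.randn(1, 3)), (0.7, torch.randn(1, 3))]:  # irregular gaps
    state = cell(x, state, dt)
print(state.shape)  # torch.Size([1, 16])
```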
arXiv Detail & Related papers (2021-06-25T22:08:51Z) - Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
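A hedged sketch of the two ingredients just described: a recurrent network emits a mean and variance for every time-stamp, and a penalty ties consecutive means together to induce smoothness. The actual SISVAE objective is a variational bound with a KL-based smoothness term; the names and the squared-difference penalty below are simplifying assumptions.

```python
# Hedged sketch: per-time-stamp Gaussian parameters plus a smoothness
# penalty on consecutive means. A simplified stand-in for the SISVAE loss.
import torch
import torch.nn as nn

T, D, H = 50, 1, 32
rnn = nn.GRU(D, H, batch_first=True)
head_mu, head_logvar = nn.Linear(H, D), nn.Linear(H, D)

x = torch.randn(8, T, D)
h, _ = rnn(x)
mu, logvar = head_mu(h), head_logvar(h)          # a Gaussian for every time-stamp

nll = 0.5 * (logvar + (x - mu) ** 2 / logvar.exp()).mean()  # Gaussian NLL (up to const.)
smooth = ((mu[:, 1:] - mu[:, :-1]) ** 2).mean()             # smoothness-inducing term
(nll + 0.1 * smooth).backward()

# Anomaly score: standardized squared error under the smoothed model.
score = (x - mu) ** 2 / logvar.exp()
```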
arXiv Detail & Related papers (2021-02-02T06:15:15Z) - Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
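A hedged sketch of the hyper-RNN flavor: a temporal latent variable modulates the recurrent cell at every step, so the effective dynamics can vary over time to track highly variable data. The modulation scheme, dimensions, and names are illustrative assumptions, not the paper's construction, which infers the latents variationally rather than sampling them freely as here.

```python
# Hedged sketch: a per-step latent variable rescales the recurrent state,
# letting the effective dynamics change over time. Illustrative only.
import torch
import torch.nn as nn

D, H, Z = 3, 16, 4
cell = nn.GRUCell(D, H)
hyper = nn.Linear(Z, H)              # maps the latent to a per-unit gain

x_seq = torch.randn(20, 1, D)        # (time, batch, features)
h = torch.zeros(1, H)
for x in x_seq:
    z = torch.randn(1, Z)            # stand-in for an inferred temporal latent
    h = cell(x, h) * torch.sigmoid(hyper(z))   # latent modulates the dynamics
print(h.shape)  # torch.Size([1, 16])
```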
arXiv Detail & Related papers (2020-02-24T19:30:32Z)