Variational latent discrete representation for time series modelling
- URL: http://arxiv.org/abs/2306.15282v3
- Date: Wed, 16 Aug 2023 08:53:45 GMT
- Title: Variational latent discrete representation for time series modelling
- Authors: Max Cohen (IP Paris, TSP - ARTEMIS, ARMEDIA-SAMOVAR), Maurice Charbit,
Sylvain Le Corff (IP Paris, TSP - CITI, ISTeC-SAMOVAR)
- Abstract summary: We introduce a latent data model where the discrete state is a Markov chain, which allows fast end-to-end training.
The performance of our generative model is assessed on a building management dataset and on the publicly available Electricity Transformer dataset.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Discrete latent space models have recently achieved performance on par with
their continuous counterparts in deep variational inference. While they still
face various implementation challenges, these models offer the opportunity for
a better interpretation of latent spaces, as well as a more direct
representation of naturally discrete phenomena. Most recent approaches propose
to separately train very high-dimensional prior models on the discrete latent
data, which is a challenging task in its own right. In this paper, we introduce a
latent data model where the discrete state is a Markov chain, which allows fast
end-to-end training. The performance of our generative model is assessed on a
building management dataset and on the publicly available Electricity
Transformer Dataset.
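To make the idea concrete, below is a minimal sketch of a sequential variational model whose latent state is a discrete Markov chain, trained end-to-end. This is not the authors' code: the GRU encoder, the Gaussian (MSE) observation model, the Gumbel-softmax relaxation, and all sizes are illustrative assumptions, and the KL term is computed from relaxed samples as an approximation.

```python
# Minimal sketch (not the paper's implementation): sequential VAE with a
# discrete Markov-chain latent state, trained end-to-end via Gumbel-softmax.
import torch
import torch.nn as nn
import torch.nn.functional as F

K, obs_dim, hidden = 8, 4, 32  # number of discrete states, observation size (assumed)

class MarkovLatentVAE(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.GRU(obs_dim, hidden, batch_first=True)
        self.to_logits = nn.Linear(hidden, K)                 # q(z_t | x_{1:t})
        self.trans_logits = nn.Parameter(torch.zeros(K, K))   # prior p(z_t | z_{t-1})
        self.init_logits = nn.Parameter(torch.zeros(K))       # prior p(z_1)
        self.decoder = nn.Linear(K, obs_dim)                   # mean of p(x_t | z_t)

    def forward(self, x, tau=0.5):
        h, _ = self.encoder(x)                                 # (B, T, hidden)
        logits = self.to_logits(h)                             # (B, T, K)
        z = F.gumbel_softmax(logits, tau=tau, hard=False)      # relaxed one-hot samples
        x_hat = self.decoder(z)
        recon = F.mse_loss(x_hat, x, reduction="mean")

        # KL between the variational posterior and the Markov-chain prior,
        # evaluated with relaxed samples (an approximation).
        log_q = F.log_softmax(logits, dim=-1)
        log_p0 = F.log_softmax(self.init_logits, dim=-1)
        log_A = F.log_softmax(self.trans_logits, dim=-1)       # rows: z_{t-1} -> z_t
        kl = (z[:, 0] * (log_q[:, 0] - log_p0)).sum(-1).mean()
        for t in range(1, x.size(1)):
            log_prior_t = z[:, t - 1] @ log_A                  # E_q[log p(z_t | z_{t-1})]
            kl = kl + (z[:, t] * (log_q[:, t] - log_prior_t)).sum(-1).mean()
        return recon + kl

model = MarkovLatentVAE()
x = torch.randn(16, 20, obs_dim)                               # toy batch of time series
loss = model(x)
loss.backward()
```

Because the transition matrix and initial distribution are learned jointly with the encoder and decoder, no separate high-dimensional prior model has to be fit on the discrete codes afterwards, which is the point of the end-to-end training emphasized in the abstract.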
Related papers
- Stochastic Diffusion: A Diffusion Probabilistic Model for Stochastic Time Series Forecasting [8.232475807691255]
We propose a novel Stochastic Diffusion (StochDiff) model that learns data-driven prior knowledge at each time step.
The learnt prior knowledge helps the model to capture complex temporal dynamics and the inherent uncertainty of the data.
arXiv Detail & Related papers (2024-06-05T00:13:38Z) - Variational quantization for state space models [3.9762742923544456]
Forecasting with large datasets that gather thousands of heterogeneous time series is a crucial statistical problem in numerous sectors.
We propose a new forecasting model that combines discrete state space hidden Markov models with recent neural network architectures and training procedures inspired by vector quantized variational autoencoders.
We assess the performance of the proposed method using several datasets and show that it outperforms other state-of-the-art solutions.
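The entry above builds on vector quantized variational autoencoders; the following sketch illustrates only the generic VQ-VAE-style quantization step (nearest-codebook lookup with a straight-through gradient), not that paper's model. Class name, codebook size, and the commitment weight are hypothetical choices.

```python
# Minimal sketch of a VQ-VAE-style quantization layer (illustrative, assumed API).
import torch
import torch.nn as nn
import torch.nn.functional as F

class VectorQuantizer(nn.Module):
    def __init__(self, num_codes=64, dim=16, beta=0.25):
        super().__init__()
        self.codebook = nn.Embedding(num_codes, dim)
        self.beta = beta

    def forward(self, h):                                  # h: (..., dim) encoder outputs
        flat = h.reshape(-1, h.size(-1))
        dist = torch.cdist(flat, self.codebook.weight)     # distances to all codes
        idx = dist.argmin(dim=-1)                          # nearest code index
        q = self.codebook(idx).view_as(h)                  # quantized vectors
        # codebook + commitment losses, straight-through gradient for q
        loss = F.mse_loss(q, h.detach()) + self.beta * F.mse_loss(h, q.detach())
        q = h + (q - h).detach()
        return q, idx.view(h.shape[:-1]), loss

vq = VectorQuantizer()
h = torch.randn(8, 24, 16)                                 # (batch, time, dim) latent sequence
q, codes, vq_loss = vq(h)
```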
arXiv Detail & Related papers (2024-04-17T07:01:41Z) - MG-TSD: Multi-Granularity Time Series Diffusion Models with Guided Learning Process [26.661721555671626]
We introduce a novel Multi-Granularity Time Series Diffusion (MG-TSD) model, which achieves state-of-the-art predictive performance.
Our approach does not rely on additional external data, making it versatile and applicable across various domains.
arXiv Detail & Related papers (2024-03-09T01:15:03Z) - PDETime: Rethinking Long-Term Multivariate Time Series Forecasting from
the perspective of partial differential equations [49.80959046861793]
We present PDETime, a novel long-term multivariate time series forecasting (LMTF) model inspired by the principles of neural PDE solvers.
Our experiments across seven diverse temporal real-world LMTF datasets reveal that PDETime adapts effectively to the intrinsic nature of the data.
arXiv Detail & Related papers (2024-02-25T17:39:44Z) - Synthetic location trajectory generation using categorical diffusion
models [50.809683239937584]
Diffusion probabilistic models (DPMs) have rapidly evolved to become one of the predominant generative models for the simulation of synthetic data.
We propose using DPMs for the generation of synthetic individual location trajectories (ILTs) which are sequences of variables representing physical locations visited by individuals.
arXiv Detail & Related papers (2024-02-19T15:57:39Z) - Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Data-efficient Large Vision Models through Sequential Autoregression [58.26179273091461]
We develop an efficient, autoregression-based vision model on a limited dataset.
We demonstrate how this model achieves proficiency in a spectrum of visual tasks spanning both high-level and low-level semantic understanding.
Our empirical evaluations underscore the model's agility in adapting to various tasks, heralding a significant reduction in the parameter footprint.
arXiv Detail & Related papers (2024-02-07T13:41:53Z) - ChiroDiff: Modelling chirographic data with Diffusion Models [132.5223191478268]
We introduce a powerful model class, Denoising Diffusion Probabilistic Models (DDPMs), for chirographic data.
Our model, named "ChiroDiff", is non-autoregressive; it learns to capture holistic concepts and therefore remains resilient to higher temporal sampling rates.
arXiv Detail & Related papers (2023-04-07T15:17:48Z) - Neural Continuous-Discrete State Space Models for Irregularly-Sampled
Time Series [18.885471782270375]
NCDSSM employs auxiliary variables to disentangle recognition from dynamics, thus requiring amortized inference only for the auxiliary variables.
We propose three flexible parameterizations of the latent dynamics and an efficient training objective that marginalizes the dynamic states during inference.
Empirical results on multiple benchmark datasets show improved imputation and forecasting performance of NCDSSM over existing models.
arXiv Detail & Related papers (2023-01-26T18:45:04Z) - DynaConF: Dynamic Forecasting of Non-Stationary Time Series [4.286546152336783]
We propose a new method to model non-stationary conditional distributions over time.
We show that our model can adapt to non-stationary time series better than state-of-the-art deep learning solutions.
arXiv Detail & Related papers (2022-09-17T21:40:02Z) - Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)