Dynamic Gaussian Mixture based Deep Generative Model For Robust
Forecasting on Sparse Multivariate Time Series
- URL: http://arxiv.org/abs/2103.02164v1
- Date: Wed, 3 Mar 2021 04:10:07 GMT
- Title: Dynamic Gaussian Mixture based Deep Generative Model For Robust
Forecasting on Sparse Multivariate Time Series
- Authors: Yinjun Wu, Jingchao Ni, Wei Cheng, Bo Zong, Dongjin Song, Zhengzhang
Chen, Yanchi Liu, Xuchao Zhang, Haifeng Chen, Susan Davidson
- Abstract summary: We propose a novel generative model, which tracks the transition of latent clusters, instead of isolated feature representations.
It is characterized by a newly designed dynamic Gaussian mixture distribution, which captures the dynamics of clustering structures.
A structured inference network is also designed to enable inductive analysis.
- Score: 43.86737761236125
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Forecasting on sparse multivariate time series (MTS) aims to model the
predictors of future values of time series given their incomplete past, which
is important for many emerging applications. However, most existing methods
process each MTS individually and do not leverage the dynamic distributions
underlying the MTS collection, leading to sub-optimal results when the sparsity is high.
To address this challenge, we propose a novel generative model that tracks
the transition of latent clusters, instead of isolated feature representations,
to achieve robust modeling. It is characterized by a newly designed dynamic
Gaussian mixture distribution, which captures the dynamics of clustering
structures and is used for emitting time series. The generative model is
parameterized by neural networks. A structured inference network is also
designed to enable inductive analysis. A gating mechanism is further
introduced to dynamically tune the Gaussian mixture distributions. Extensive
experimental results on a variety of real-life datasets demonstrate the
effectiveness of our method.
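The abstract describes the model only at a high level. As a rough, hedged sketch of the core idea (a Gaussian mixture whose weights evolve over time and are tuned by a gate), the following PyTorch fragment may help; the class name, the uniform blending prior, and all shapes are illustrative assumptions, not the authors' implementation:

```python
# Hypothetical sketch of a dynamic Gaussian mixture emission with a gating
# mechanism, loosely following the abstract. Names, shapes, and the uniform
# blending prior are assumptions, not the authors' implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicGaussianMixture(nn.Module):
    def __init__(self, n_clusters, latent_dim, obs_dim):
        super().__init__()
        self.means = nn.Parameter(torch.randn(n_clusters, obs_dim))    # cluster centroids
        self.log_var = nn.Parameter(torch.zeros(n_clusters, obs_dim))  # per-cluster variances
        self.transition = nn.Linear(n_clusters, n_clusters)            # cluster transition logits
        self.gate = nn.Linear(latent_dim, 1)                           # gate from the latent state

    def forward(self, prev_weights, z_t):
        # Transition of the mixture weights: the clustering structure at step t
        # depends on the structure at step t-1, not on isolated representations.
        trans = F.softmax(self.transition(prev_weights), dim=-1)
        # Gate in (0, 1) blends the transitioned weights with a flat prior,
        # one simple way to "dynamically tune" the mixture as the abstract states.
        g = torch.sigmoid(self.gate(z_t))
        flat = torch.full_like(trans, 1.0 / trans.size(-1))
        weights = g * trans + (1.0 - g) * flat
        # Emit an observation from the resulting Gaussian mixture.
        k = torch.distributions.Categorical(weights).sample()
        std = torch.exp(0.5 * self.log_var[k])
        x_t = self.means[k] + std * torch.randn_like(std)
        return x_t, weights
```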
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
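The blurb leaves the simulation-free objective implicit; as a hedged sketch, a generic conditional flow-matching loss of the kind such methods build on regresses a velocity network onto the displacement of a linear interpolation path (TFM's exact objective and clinical conditioning may differ):

```python
# Generic conditional flow-matching loss (an assumption about the family of
# objectives TFM builds on, not its exact formulation): no SDE simulation or
# backpropagation through the dynamics is needed.
import torch

def flow_matching_loss(v_net, x0, x1):
    t = torch.rand(x0.size(0), 1)   # random time in [0, 1] per sample
    x_t = (1 - t) * x0 + t * x1     # point on the straight-line path
    target = x1 - x0                # constant velocity of that path
    return ((v_net(x_t, t) - target) ** 2).mean()
```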
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Recurrent Interpolants for Probabilistic Time Series Prediction [10.422645245061899]
Sequential models like recurrent neural networks and transformers have become standard for probabilistic time series forecasting.
Recent work explores generative approaches using diffusion or flow-based models, extending to time series imputation and forecasting.
This work proposes a novel method combining recurrent neural networks' efficiency with diffusion models' probabilistic modeling, based on interpolants and conditional generation with control features.
arXiv Detail & Related papers (2024-09-18T03:52:48Z)
- Contextually Enhanced ES-dRNN with Dynamic Attention for Short-Term Load Forecasting [1.1602089225841632]
The proposed model is composed of two simultaneously trained tracks: the context track and the main track.
The RNN architecture consists of multiple recurrent layers stacked with hierarchical dilations and equipped with recently proposed attentive recurrent cells.
The model produces both point forecasts and predictive intervals.
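For producing predictive intervals alongside point forecasts, forecasters in this family commonly train quantile heads with the pinball loss; the sketch below is a standard formulation and an assumption about this paper's setup:

```python
# Pinball (quantile) loss: training one output head per quantile yields
# the bounds of a predictive interval. A standard construction, assumed
# here rather than taken from the paper.
import torch

def pinball_loss(y, y_hat, q):
    diff = y - y_hat
    return torch.maximum(q * diff, (q - 1.0) * diff).mean()

# e.g. heads trained with q = 0.05 and q = 0.95 bound a 90% interval
```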
arXiv Detail & Related papers (2022-12-18T07:42:48Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- ES-dRNN: A Hybrid Exponential Smoothing and Dilated Recurrent Neural Network Model for Short-Term Load Forecasting [1.4502611532302039]
Short-term load forecasting (STLF) is challenging due to complex time series (TS).
This paper proposes a novel hybrid hierarchical deep learning model that deals with multiple seasonality.
It combines exponential smoothing (ES) and a recurrent neural network (RNN).
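In ES/RNN hybrids of this kind, exponential smoothing typically tracks each series' level (and seasonality) so the RNN only has to model the normalized pattern. A minimal level-only sketch under that assumed division of labor:

```python
# Level-only exponential smoothing used to normalize a series before an RNN
# models the residual pattern; a minimal sketch of the assumed ES/RNN split,
# not the paper's exact equations.
import torch

def es_normalize(y, alpha=0.3):
    levels = [y[0]]
    for t in range(1, len(y)):
        levels.append(alpha * y[t] + (1 - alpha) * levels[-1])
    levels = torch.stack(levels)
    return levels, y / levels  # RNN consumes y / level; forecasts are re-scaled by the last level
```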
arXiv Detail & Related papers (2021-12-05T19:38:42Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster than their ODE-solver-based counterparts.
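The speedup comes from replacing the numerical solver with an explicit state formula; the sketch below mirrors the closed-form gating structure only loosely, with three stand-in networks, and should be read as an assumption rather than the exact CfC equation:

```python
# Loose sketch of a closed-form continuous-depth update: a time-dependent
# sigmoid gate interpolates two learned branches, so no ODE solver is run.
# f, g, h are stand-in networks; the exact CfC heads are in the paper.
import torch

def cfc_state(f, g, h, x, t):
    gate = torch.sigmoid(-f(x) * t)
    return gate * g(x) + (1.0 - gate) * h(x)
```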
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Deep Probabilistic Time Series Forecasting using Augmented Recurrent Input for Dynamic Systems [12.319812075685956]
We combine advances in both deep generative models and state space models (SSMs) to arrive at a novel, data-driven deep probabilistic sequence model.
Specifically, we follow the popular encoder-decoder generative structure to build a recurrent neural network (RNN) assisted variational sequence model.
To alleviate the inconsistency between training and predicting, we propose using a hybrid output as the input at the next time step, which brings training and predicting into alignment.
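A plain reading of the hybrid-output idea is a convex blend of the observed value and the model's own prediction fed back as the next input, in the spirit of scheduled sampling; the fixed mixing weight below is an assumption:

```python
# Hybrid feedback input: blend the observed value with the model's own
# prediction so training sees inputs that resemble free-running prediction.
# The fixed mixing weight is an assumption, not the paper's exact rule.
def hybrid_input(y_obs, y_pred, w=0.5):
    return w * y_pred + (1.0 - w) * y_obs
```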
arXiv Detail & Related papers (2021-06-03T23:41:11Z)
- Anomaly Detection of Time Series with Smoothness-Inducing Sequential Variational Auto-Encoder [59.69303945834122]
We present a Smoothness-Inducing Sequential Variational Auto-Encoder (SISVAE) model for robust estimation and anomaly detection of time series.
Our model parameterizes mean and variance for each time-stamp with flexible neural networks.
We show the effectiveness of our model on both synthetic datasets and public real-world benchmarks.
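The per-time-stamp parameterization is the usual Gaussian output head of a sequential VAE; a minimal sketch (layer sizes are illustrative):

```python
# Gaussian output head: maps each hidden state h_t to a per-time-stamp
# mean and variance, as the SISVAE summary describes; sizes are illustrative.
import torch.nn as nn

class GaussianHead(nn.Module):
    def __init__(self, hidden_dim, obs_dim):
        super().__init__()
        self.mean = nn.Linear(hidden_dim, obs_dim)
        self.log_var = nn.Linear(hidden_dim, obs_dim)

    def forward(self, h):
        return self.mean(h), self.log_var(h).exp()
```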
arXiv Detail & Related papers (2021-02-02T06:15:15Z)
- Variational Dynamic Mixtures [18.730501689781214]
We develop variational dynamic mixtures (VDM) to infer sequential latent variables.
In an empirical study, we show that VDM outperforms competing approaches on highly multi-modal datasets.
arXiv Detail & Related papers (2020-10-20T16:10:07Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)