Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting
- URL: http://arxiv.org/abs/2102.00431v1
- Date: Sun, 31 Jan 2021 11:00:55 GMT
- Title: Synergetic Learning of Heterogeneous Temporal Sequences for
Multi-Horizon Probabilistic Forecasting
- Authors: Longyuan Li, Jihai Zhang, Junchi Yan, Yaohui Jin, Yunhao Zhang, Yanjie
Duan, and Guangjian Tian
- Abstract summary: We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine the advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
- Score: 48.8617204809538
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series are ubiquitous across applications such as
transportation, finance, and healthcare. They are often influenced by
external factors, especially asynchronous events, which makes forecasting
difficult. However, existing models are mainly designed for either
synchronous time series or asynchronous event sequences, and can hardly
capture the relation between the two in a unified way. We propose the
Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep
conditional generative model. To learn complex correlations across
heterogeneous sequences, a tailored encoder is devised that combines
advances in deep point process models and variational recurrent neural
networks. In addition, an aligned time coding and an auxiliary transition
scheme are carefully devised for batched training on unaligned sequences.
The model can be trained effectively using stochastic variational
inference and generates probabilistic predictions via Monte-Carlo
simulation. It produces accurate, sharp, and realistic probabilistic
forecasts, and we show that modeling asynchronous event sequences is
crucial for multi-horizon time-series forecasting.
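
To make the abstract's pipeline concrete, below is a minimal sketch, with illustrative names and sizes that are assumptions rather than the authors' code: a point-process-style GRU encodes the asynchronous events, a second GRU encodes the synchronous series, the two are fused into a Gaussian latent trained with an ELBO, and forecasts come from Monte-Carlo sampling. A standard-normal prior stands in for the paper's conditional prior; the aligned time coding and auxiliary transition scheme are omitted.

```python
# Illustrative sketch of a VSMHN-style conditional generative forecaster.
# All class and variable names are assumptions for exposition only.
import torch
import torch.nn as nn
import torch.distributions as D

class HeteroEncoder(nn.Module):
    """Fuses an asynchronous event stream and a regular time series."""
    def __init__(self, n_event_types, ts_dim, hidden=64, latent=16):
        super().__init__()
        self.event_emb = nn.Embedding(n_event_types, hidden)
        # Point-process-style encoder: event-type embedding + inter-event gap.
        self.event_rnn = nn.GRU(hidden + 1, hidden, batch_first=True)
        # VRNN-style encoder for the synchronous series.
        self.ts_rnn = nn.GRU(ts_dim, hidden, batch_first=True)
        self.to_stats = nn.Linear(2 * hidden, 2 * latent)

    def forward(self, ev_types, ev_gaps, ts):
        ev_in = torch.cat([self.event_emb(ev_types), ev_gaps.unsqueeze(-1)], -1)
        _, h_ev = self.event_rnn(ev_in)
        _, h_ts = self.ts_rnn(ts)
        stats = self.to_stats(torch.cat([h_ev[-1], h_ts[-1]], -1))
        mu, log_sig = stats.chunk(2, -1)
        return D.Normal(mu, log_sig.exp())

class Decoder(nn.Module):
    """Maps a latent sample to per-horizon Gaussian forecast parameters."""
    def __init__(self, latent=16, horizon=24, ts_dim=1, hidden=64):
        super().__init__()
        self.horizon, self.ts_dim = horizon, ts_dim
        self.net = nn.Sequential(
            nn.Linear(latent, hidden), nn.ReLU(),
            nn.Linear(hidden, horizon * ts_dim * 2))

    def forward(self, z):
        out = self.net(z).view(-1, self.horizon, self.ts_dim, 2)
        return D.Normal(out[..., 0], out[..., 1].exp())

def elbo(post, prior, dec, z, target):
    recon = dec(z).log_prob(target.unsqueeze(-1)).sum((-1, -2))
    return (recon - D.kl_divergence(post, prior).sum(-1)).mean()

# Monte-Carlo forecasting: sample z repeatedly, decode, read off quantiles.
@torch.no_grad()
def forecast(enc, dec, ev_types, ev_gaps, ts, n_samples=100):
    post = enc(ev_types, ev_gaps, ts)
    draws = torch.stack([dec(post.sample()).sample() for _ in range(n_samples)])
    return draws.quantile(torch.tensor([0.1, 0.5, 0.9]), dim=0)

# Usage (toy shapes): the reparameterized sample keeps the ELBO differentiable.
enc, dec = HeteroEncoder(n_event_types=5, ts_dim=1), Decoder()
ev_t, ev_g = torch.randint(0, 5, (8, 12)), torch.rand(8, 12)
ts, target = torch.randn(8, 48, 1), torch.randn(8, 24)
post = enc(ev_t, ev_g, ts)
prior = D.Normal(torch.zeros_like(post.loc), torch.ones_like(post.scale))
loss = -elbo(post, prior, dec, post.rsample(), target)
```

Swapping the fixed prior for a context-conditioned one would bring this sketch closer to the paper's conditional generative formulation.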
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- Recurrent Interpolants for Probabilistic Time Series Prediction [10.422645245061899]
Sequential models like recurrent neural networks and transformers have become standard for probabilistic time series forecasting.
Recent work explores generative approaches using diffusion or flow-based models, extending to time series imputation and forecasting.
This work proposes a novel method combining recurrent neural networks' efficiency with diffusion models' probabilistic modeling, based on interpolants and conditional generation with control features.
arXiv Detail & Related papers (2024-09-18T03:52:48Z)
- Interacting Diffusion Processes for Event Sequence Forecasting [20.380620709345898]
We introduce a novel approach that incorporates a diffusion generative model.
The model facilitates sequence-to-sequence prediction, allowing multi-step predictions based on historical event sequences.
We demonstrate that our proposal outperforms state-of-the-art baselines for long-horizon forecasting of temporal point processes (TPPs).
arXiv Detail & Related papers (2023-10-26T22:17:25Z)
- Time Series Continuous Modeling for Imputation and Forecasting with Implicit Neural Representations [15.797295258800638]
We introduce a novel modeling approach for time series imputation and forecasting, tailored to address the challenges often encountered in real-world data.
Our method relies on a continuous-time-dependent model of the series' evolution dynamics.
A modulation mechanism, driven by a meta-learning algorithm, allows adaptation to unseen samples and extrapolation beyond observed time windows (a toy modulated-INR sketch appears after this list).
arXiv Detail & Related papers (2023-06-09T13:20:04Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are commonly recorded over short time periods, which leaves a large gap between deep models' capacity and the limited, noisy data.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Predicting the State of Synchronization of Financial Time Series using Cross Recurrence Plots [75.20174445166997]
This study introduces a new method for predicting the future state of synchronization of the dynamics of two financial time series.
We adopt a deep learning framework for methodologically addressing the prediction of the synchronization state.
We find that predicting the synchronization state of two time series is in general rather difficult, but for certain pairs of stocks it is attainable with very satisfactory performance (a toy cross-recurrence-plot construction is sketched after this list).
arXiv Detail & Related papers (2022-10-26T10:22:28Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Model-Attentive Ensemble Learning for Sequence Modeling [86.4785354333566]
We present Model-Attentive Ensemble learning for Sequence modeling (MAES), a mixture of time-series experts that leverages an attention-based gating mechanism to specialize the experts on different sequence dynamics and adaptively weight their predictions (a minimal gated-mixture sketch appears after this list).
We demonstrate that MAES significantly outperforms popular sequence models on datasets subject to temporal shift.
arXiv Detail & Related papers (2021-02-23T05:23:35Z)
- Temporal Latent Auto-Encoder: A Method for Probabilistic Multivariate Time Series Forecasting [4.131842516813833]
We introduce a novel temporal latent auto-encoder method which enables nonlinear factorization of time series.
By imposing a probabilistic latent space model, complex distributions of the input series are modeled via the decoder.
Our model achieves state-of-the-art performance on many popular multivariate datasets, with gains sometimes as high as 50% on several standard metrics.
arXiv Detail & Related papers (2021-01-25T22:29:40Z)
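
For the implicit-neural-representation entry above, a rough sketch of the core idea under stated assumptions: a network maps a continuous timestamp plus a per-series modulation code to a value, so imputation and extrapolation reduce to querying arbitrary timestamps. All names are hypothetical, and the meta-learned fitting of the codes is omitted.

```python
# Rough sketch of a time-series implicit neural representation with a
# per-sample modulation code; not the paper's exact architecture.
import torch
import torch.nn as nn

class ModulatedINR(nn.Module):
    def __init__(self, code_dim=16, hidden=64, out_dim=1):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1 + code_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim))

    def forward(self, t, code):
        # t: (n, 1) continuous timestamps; code: (code_dim,) per-series vector.
        c = code.expand(t.shape[0], -1)
        return self.net(torch.cat([t, c], dim=-1))

inr = ModulatedINR()
t = torch.linspace(0, 2, 50).unsqueeze(-1)   # query any timestamps,
y = inr(t, torch.zeros(16))                  # including beyond training range
```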
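For the cross-recurrence-plot entry, a toy construction of the classical CRP itself: a binary matrix marking where the delay-embedded trajectories of two series come close. This is the generic textbook object, not the authors' deep-learning pipeline.

```python
# Toy cross recurrence plot (CRP) between two 1-D series.
import numpy as np

def delay_embed(x, dim=3, tau=1):
    """Takens-style delay embedding of a 1-D series."""
    n = len(x) - (dim - 1) * tau
    return np.stack([x[i:i + n] for i in range(0, dim * tau, tau)], axis=1)

def cross_recurrence_plot(x, y, dim=3, tau=1, eps=0.1):
    """CRP[i, j] = 1 iff embedded points x_i and y_j are within eps."""
    ex, ey = delay_embed(x, dim, tau), delay_embed(y, dim, tau)
    dists = np.linalg.norm(ex[:, None, :] - ey[None, :, :], axis=-1)
    return (dists < eps).astype(np.uint8)

# Example: two noisy sine waves; dense diagonal structure signals synchrony.
t = np.linspace(0, 8 * np.pi, 400)
a = np.sin(t) + 0.05 * np.random.randn(400)
b = np.sin(t + 0.3) + 0.05 * np.random.randn(400)
crp = cross_recurrence_plot(a, b, eps=0.5)
print(crp.shape, crp.mean())  # matrix size and recurrence rate
```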
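For the MAES entry, a minimal gated-mixture sketch assuming GRU experts and a softmax gate; the layer sizes, expert architecture, and gating form are assumptions, not the paper's design.

```python
# Minimal attention-gated mixture of sequence experts, MAES-flavored.
import torch
import torch.nn as nn

class AttentiveMixture(nn.Module):
    def __init__(self, in_dim=1, hidden=32, n_experts=4):
        super().__init__()
        # Each expert is its own recurrent forecaster.
        self.experts = nn.ModuleList(
            [nn.GRU(in_dim, hidden, batch_first=True) for _ in range(n_experts)])
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, in_dim) for _ in range(n_experts)])
        # Gating network scores each expert from a shared input summary.
        self.gate_rnn = nn.GRU(in_dim, hidden, batch_first=True)
        self.gate = nn.Linear(hidden, n_experts)

    def forward(self, x):                      # x: (batch, time, in_dim)
        _, g = self.gate_rnn(x)
        weights = torch.softmax(self.gate(g[-1]), dim=-1)  # (batch, experts)
        preds = []
        for rnn, head in zip(self.experts, self.heads):
            _, h = rnn(x)
            preds.append(head(h[-1]))          # one-step forecast per expert
        preds = torch.stack(preds, dim=1)      # (batch, experts, in_dim)
        return (weights.unsqueeze(-1) * preds).sum(dim=1)

model = AttentiveMixture()
y_hat = model(torch.randn(8, 24, 1))           # next-step prediction
print(y_hat.shape)                             # torch.Size([8, 1])
```

The gate specializes experts on different dynamics because each expert's gradient is scaled by its weight, so experts that win a regime are trained mostly on it.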