A novel decomposed-ensemble time series forecasting framework: capturing
underlying volatility information
- URL: http://arxiv.org/abs/2310.08812v4
- Date: Wed, 29 Nov 2023 01:41:23 GMT
- Title: A novel decomposed-ensemble time series forecasting framework: capturing
underlying volatility information
- Authors: Zhengtao Gui, Haoyuan Li, Sijie Xu, Yu Chen
- Abstract summary: We propose a novel time series forecasting paradigm that integrates decomposition with the capability to capture the underlying fluctuation information of the series.
Both the numerical data and the volatility information for each sub-mode are harnessed to train a neural network.
This network is adept at predicting the information of the sub-modes, and we aggregate the predictions of all sub-modes to generate the final output.
- Score: 6.590038231008498
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Time series forecasting represents a significant and challenging task across
various fields. Recently, methods based on mode decomposition have dominated
the forecasting of complex time series because of the advantages of capturing
local characteristics and extracting intrinsic modes from data. Unfortunately,
most models fail to capture the implied volatilities that contain significant
information. To enhance the prediction of contemporary diverse and complex time
series, we propose a novel time series forecasting paradigm that integrates
decomposition with the capability to capture the underlying fluctuation
information of the series. In our methodology, we implement the Variational
Mode Decomposition algorithm to decompose the time series into K distinct
sub-modes. Following this decomposition, we apply the Generalized
Autoregressive Conditional Heteroskedasticity (GARCH) model to extract the
volatility information in these sub-modes. Subsequently, both the numerical
data and the volatility information for each sub-mode are harnessed to train a
neural network. This network is adept at predicting the information of the
sub-modes, and we aggregate the predictions of all sub-modes to generate the
final output. By integrating econometric and artificial intelligence methods,
and taking into account both the numerical and volatility information of the
time series, our proposed framework demonstrates superior performance in time
series forecasting, as evidenced by the significant decrease in MSE, RMSE, and
MAPE in our comparative experimental results.
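A minimal sketch of the pipeline described above: VMD splits the series into K sub-modes, a GARCH(1,1) fit supplies each sub-mode's conditional volatility, a small network is trained per sub-mode on lagged values plus lagged volatility, and the K one-step forecasts are summed. The vmdpy and arch packages, K = 5, the 10-step lag window, and the scikit-learn MLP are illustrative assumptions, not choices specified by the paper.

    import numpy as np
    from vmdpy import VMD                        # pip install vmdpy
    from arch import arch_model                  # pip install arch
    from sklearn.neural_network import MLPRegressor

    def forecast_decomposed_ensemble(series, K=5, lags=10):
        # 1) Variational Mode Decomposition into K sub-modes (rows of u)
        u, _, _ = VMD(series, alpha=2000, tau=0.0, K=K, DC=0, init=1, tol=1e-7)

        final_forecast = 0.0
        for mode in u:
            # 2) GARCH(1,1) conditional volatility of this sub-mode
            garch = arch_model(mode, vol="GARCH", p=1, q=1, rescale=False)
            vol = garch.fit(disp="off").conditional_volatility

            # 3) Supervised samples: lagged values + lagged volatility -> next value
            X, y = [], []
            for t in range(lags, len(mode)):
                X.append(np.concatenate([mode[t - lags:t], vol[t - lags:t]]))
                y.append(mode[t])
            X, y = np.asarray(X), np.asarray(y)

            # 4) Train a small network for this sub-mode and forecast one step ahead
            net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=2000, random_state=0)
            net.fit(X, y)
            x_next = np.concatenate([mode[-lags:], vol[-lags:]]).reshape(1, -1)
            final_forecast += net.predict(x_next)[0]

        # 5) Aggregate: the sub-mode forecasts sum to the forecast of the original series
        return final_forecast

    # Example: one-step-ahead forecast for a synthetic noisy series
    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        t = np.linspace(0, 20, 600)
        y = np.sin(t) + 0.1 * rng.standard_normal(len(t))
        print(forecast_decomposed_ensemble(y))

The same aggregation step applies to multi-step forecasts; the paper evaluates the resulting predictions with MSE, RMSE, and MAPE.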
Related papers
- Timer-XL: Long-Context Transformers for Unified Time Series Forecasting [67.83502953961505]
We present Timer-XL, a generative Transformer for unified time series forecasting.
Timer-XL achieves state-of-the-art performance across challenging forecasting benchmarks through a unified approach.
arXiv Detail & Related papers (2024-10-07T07:27:39Z)
- StreamEnsemble: Predictive Queries over Spatiotemporal Streaming Data [0.8437187555622164]
We propose StreamEnsemble, a novel approach to predictive queries over spatiotemporal (ST) data distributions.
Our experimental evaluation reveals that this method markedly outperforms traditional ensemble methods and single model approaches in terms of accuracy and time.
arXiv Detail & Related papers (2024-09-30T23:50:16Z)
- A Novel Method Combines Moving Fronts, Data Decomposition and Deep Learning to Forecast Intricate Time Series [0.0]
Indian Summer Monsoon Rainfall (ISMR) is a very complex time series.
Conventional one-time decomposition techniques suffer from information leakage from the future.
The Moving Front (MF) method is proposed to prevent this data leakage.
arXiv Detail & Related papers (2023-03-11T12:07:26Z)
- Ti-MAE: Self-Supervised Masked Time Series Autoencoders [16.98069693152999]
We propose a novel framework named Ti-MAE, in which the input time series are assumed to follow an integrated distribution.
Ti-MAE randomly masks out embedded time series data and learns an autoencoder to reconstruct them at the point-level.
Experiments on several public real-world datasets demonstrate that our framework of masked autoencoding could learn strong representations directly from the raw data.
arXiv Detail & Related papers (2023-01-21T03:20:23Z)
- Generative Time Series Forecasting with Diffusion, Denoise, and Disentanglement [51.55157852647306]
Time series forecasting has been a widely explored task of great importance in many applications.
Real-world time series are often recorded over short periods, which leaves a large gap between data-hungry deep models and the limited, noisy series available.
We propose to address the time series forecasting problem with generative modeling and propose a bidirectional variational auto-encoder equipped with diffusion, denoise, and disentanglement.
arXiv Detail & Related papers (2023-01-08T12:20:46Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model, called Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- TACTiS: Transformer-Attentional Copulas for Time Series [76.71406465526454]
The estimation of time-varying quantities is a fundamental component of decision making in fields such as healthcare and finance.
We propose a versatile method that estimates joint distributions using an attention-based decoder.
We show that our model produces state-of-the-art predictions on several real-world datasets.
arXiv Detail & Related papers (2022-02-07T21:37:29Z)
- Predicting Temporal Sets with Deep Neural Networks [50.53727580527024]
We propose an integrated solution based on the deep neural networks for temporal sets prediction.
A unique perspective is to learn element relationships by constructing a set-level co-occurrence graph.
We design an attention-based module to adaptively learn the temporal dependency of elements and sets.
arXiv Detail & Related papers (2020-06-20T03:29:02Z)
- Variational Hyper RNN for Sequence Modeling [69.0659591456772]
We propose a novel probabilistic sequence model that excels at capturing high variability in time series data.
Our method uses temporal latent variables to capture information about the underlying data pattern.
The efficacy of the proposed method is demonstrated on a range of synthetic and real-world sequential data.
arXiv Detail & Related papers (2020-02-24T19:30:32Z)