Global Models for Time Series Forecasting: A Simulation Study
- URL: http://arxiv.org/abs/2012.12485v3
- Date: Mon, 22 Mar 2021 03:39:03 GMT
- Title: Global Models for Time Series Forecasting: A Simulation Study
- Authors: Hansika Hewamalage, Christoph Bergmeir, Kasun Bandara
- Abstract summary: We simulate time series from simple data generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold Auto-Regressive, and Mackey-Glass equations.
The lengths and the number of series in the dataset are varied in different scenarios.
We perform experiments on these datasets using global forecasting models including Recurrent Neural Networks (RNN), Feed-Forward Neural Networks, Pooled Regression (PR) models, and Light Gradient Boosting Models (LGBM).
- Score: 2.580765958706854
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the current context of Big Data, the nature of many forecasting problems
has changed from predicting isolated time series to predicting many time series
from similar sources. This has opened up the opportunity to develop competitive
global forecasting models that simultaneously learn from many time series.
However, it remains unclear when global forecasting models can outperform the
univariate benchmarks, especially along the dimensions of the
homogeneity/heterogeneity of series, the complexity of patterns in the series,
the complexity of forecasting models, and the lengths/number of series. Our
study attempts to address this problem by investigating the effects of these
factors through simulating a number of datasets with controllable time
series characteristics. Specifically, we simulate time series from simple data
generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to
complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold
Auto-Regressive, and Mackey-Glass Equations. The data heterogeneity is
introduced by mixing time series generated from several DGPs into a single
dataset. The lengths and the number of series in the dataset are varied in
different scenarios. We perform experiments on these datasets using global
forecasting models including Recurrent Neural Networks (RNN), Feed-Forward
Neural Networks, Pooled Regression (PR) models and Light Gradient Boosting
Models (LGBM), and compare their performance against standard statistical
univariate forecasting techniques. Our experiments demonstrate that, when
trained as global forecasting models, techniques with complex non-linear
modelling capabilities, such as RNNs and LGBMs, are in general competitive
under challenging forecasting scenarios: short series, datasets of
heterogeneous series, and minimal prior knowledge of the patterns of the
series.
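To make the simulation setup concrete, the sketch below generates series from two of the DGPs named above, a linear AR(3) process and the Chaotic Logistic Map, and mixes them into one heterogeneous dataset in the spirit of the paper's mixed-DGP scenarios. The coefficients, noise scale, series lengths, and dataset sizes are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def simulate_ar(coeffs, length, sigma=0.1, burn_in=100, rng=None):
    """Simulate an AR(p) series: y_t = sum_i coeffs[i] * y_{t-i} + eps_t."""
    rng = rng or np.random.default_rng()
    p = len(coeffs)
    y = np.zeros(length + burn_in)
    for t in range(p, length + burn_in):
        # coeffs[0] multiplies y_{t-1}, coeffs[1] multiplies y_{t-2}, ...
        y[t] = np.dot(coeffs, y[t - p:t][::-1]) + rng.normal(scale=sigma)
    return y[burn_in:]  # discard the burn-in so the series is near-stationary

def simulate_logistic_map(length, r=3.9, x0=0.5):
    """Chaotic Logistic Map: x_{t+1} = r * x_t * (1 - x_t), chaotic near r = 4."""
    x = np.empty(length)
    x[0] = x0
    for t in range(1, length):
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

# Heterogeneous dataset: half the series from an AR(3) DGP, half from the
# Chaotic Logistic Map (hypothetical sizes; the paper varies both the number
# and the lengths of series across scenarios).
rng = np.random.default_rng(42)
dataset = (
    [simulate_ar([0.5, -0.2, 0.1], length=200, rng=rng) for _ in range(50)]
    + [simulate_logistic_map(200, x0=rng.uniform(0.1, 0.9)) for _ in range(50)]
)
```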
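Likewise, the global-versus-univariate distinction the paper studies can be illustrated with a Pooled Regression: a single linear autoregression fitted to lag windows pooled from every series, contrasted with one model per series. The lag order, the random-walk placeholder data, and the use of scikit-learn's LinearRegression are assumptions for this sketch; the paper's PR models and tuning may differ.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

def lag_windows(series, n_lags):
    """Turn one series into (lag-window, next-value) training pairs."""
    X = np.array([series[t - n_lags:t] for t in range(n_lags, len(series))])
    y = series[n_lags:]
    return X, y

N_LAGS = 10  # hypothetical lag order, not the paper's tuned value

# `dataset` is any list of 1-D numpy arrays, e.g. the mixed-DGP series from
# the previous sketch; random-walk placeholders keep this snippet standalone.
rng = np.random.default_rng(0)
dataset = [np.cumsum(rng.normal(size=200)) for _ in range(100)]

# Global (pooled) model: one set of parameters learned from ALL series.
X_pool = np.vstack([lag_windows(s, N_LAGS)[0] for s in dataset])
y_pool = np.concatenate([lag_windows(s, N_LAGS)[1] for s in dataset])
global_model = LinearRegression().fit(X_pool, y_pool)

# Local benchmark: a separate model per series, as in univariate forecasting.
local_models = [LinearRegression().fit(*lag_windows(s, N_LAGS)) for s in dataset]

# One-step-ahead forecasts for series i from its most recent lag window.
i = 0
last_window = dataset[i][-N_LAGS:].reshape(1, -1)
print("global forecast:", global_model.predict(last_window)[0])
print("local forecast :", local_models[i].predict(last_window)[0])
```

Because the global model's parameters are shared, every additional series contributes training data to the same function, which is one intuition for why global methods can stay competitive when the individual series are short.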
Related papers
- GinAR: An End-To-End Multivariate Time Series Forecasting Model Suitable for Variable Missing [21.980379175333443]
We propose a novel Graph Interpolation Attention Recursive Network (named GinAR) to model the spatial-temporal dependencies over the limited collected data for forecasting.
GinAR consists of two key components: attention and adaptive graph convolution.
Experiments conducted on five real-world datasets demonstrate that GinAR outperforms 11 SOTA baselines, and even when 90% of variables are missing, it can still accurately predict the future values of all variables.
arXiv Detail & Related papers (2024-05-18T16:42:44Z)
- Context Neural Networks: A Scalable Multivariate Model for Time Series Forecasting [5.5711773076846365]
Real-world time series often exhibit complex interdependencies that cannot be captured in isolation.
This paper introduces the Context Neural Network, an efficient linear complexity approach for augmenting time series models with relevant contextual insights.
arXiv Detail & Related papers (2024-05-12T00:21:57Z)
- Time Series Data Augmentation as an Imbalanced Learning Problem [2.5536554335016417]
We use oversampling strategies to create synthetic time series observations and improve the accuracy of forecasting models.
We carried out experiments using 7 different databases that contain a total of 5502 univariate time series.
We found that the proposed solution outperforms both a global and a local model, thus providing a better trade-off between these two approaches.
arXiv Detail & Related papers (2024-04-29T09:27:15Z)
- Unified Training of Universal Time Series Forecasting Transformers [104.56318980466742]
We present the Masked Encoder-based Universal Time Series Forecasting Transformer (Moirai).
Moirai is trained on our newly introduced Large-scale Open Time Series Archive (LOTSA) featuring over 27B observations across nine domains.
Moirai achieves competitive or superior performance as a zero-shot forecaster when compared to full-shot models.
arXiv Detail & Related papers (2024-02-04T20:00:45Z)
- Timer: Generative Pre-trained Transformers Are Large Time Series Models [83.03091523806668]
This paper aims at the early development of large time series models (LTSM).
During pre-training, we curate large-scale datasets with up to 1 billion time points.
To meet diverse application needs, we convert forecasting, imputation, and anomaly detection of time series into a unified generative task.
arXiv Detail & Related papers (2023-10-12T12:29:32Z)
- Lag-Llama: Towards Foundation Models for Probabilistic Time Series Forecasting [54.04430089029033]
We present Lag-Llama, a general-purpose foundation model for time series forecasting based on a decoder-only transformer architecture.
Lag-Llama is pretrained on a large corpus of diverse time series data from several domains, and demonstrates strong zero-shot generalization capabilities.
When fine-tuned on relatively small fractions of such previously unseen datasets, Lag-Llama achieves state-of-the-art performance.
arXiv Detail & Related papers (2023-06-20T16:39:27Z)
- G-NM: A Group of Numerical Time Series Prediction Models [0.0]
The Group of Numerical Time Series Prediction Models (G-NM) encapsulates both linear and non-linear dependencies, seasonalities, and trends present in time series data.
G-NM is explicitly constructed to augment our predictive capabilities related to patterns and trends inherent in complex natural phenomena.
arXiv Detail & Related papers (2022-05-16T07:53:42Z)
- Multi-scale Attention Flow for Probabilistic Time Series Forecasting [68.20798558048678]
We propose a novel non-autoregressive deep learning model called the Multi-scale Attention Normalizing Flow (MANF).
Our model avoids the influence of cumulative error and does not increase the time complexity.
Our model achieves state-of-the-art performance on many popular multivariate datasets.
arXiv Detail & Related papers (2021-07-13T11:08:47Z)
- Deep Autoregressive Models with Spectral Attention [74.08846528440024]
We propose a forecasting architecture that combines deep autoregressive models with a Spectral Attention (SA) module.
By characterizing in the spectral domain the embedding of the time series as occurrences of a random process, our method can identify global trends and seasonality patterns.
Two spectral attention models, global and local to the time series, integrate this information within the forecast and perform spectral filtering to remove the time series' noise.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
- Synergetic Learning of Heterogeneous Temporal Sequences for Multi-Horizon Probabilistic Forecasting [48.8617204809538]
We propose Variational Synergetic Multi-Horizon Network (VSMHN), a novel deep conditional generative model.
To learn complex correlations across heterogeneous sequences, a tailored encoder is devised to combine advances in deep point process models and variational recurrent neural networks.
Our model can be trained effectively using variational inference and generates predictions with Monte-Carlo simulation.
arXiv Detail & Related papers (2021-01-31T11:00:55Z)
This list is automatically generated from the titles and abstracts of the papers in this site.