Two-Step Meta-Learning for Time-Series Forecasting Ensemble
- URL: http://arxiv.org/abs/2011.10545v2
- Date: Fri, 15 Apr 2022 10:07:10 GMT
- Title: Two-Step Meta-Learning for Time-Series Forecasting Ensemble
- Authors: Evaldas Vaiciukynas, Paulius Danenas, Vilius Kontrimas, Rimantas
Butleris
- Abstract summary: Since no single time-series modeling method suits all types of dynamics, forecasting using an ensemble of several methods is often seen as a compromise.
We propose to predict ensemble diversity and size adaptively using meta-learning.
The proposed approach was tested on 12561 micro-economic time-series.
- Score: 1.1278903078792915
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The amount of historical data being collected keeps increasing, and business
intelligence applications with automatic time-series forecasting are in high demand.
While no single time series modeling method is universal to all types of
dynamics, forecasting using an ensemble of several methods is often seen as a
compromise. Instead of fixing ensemble diversity and size, we propose to
predict these aspects adaptively using meta-learning. Meta-learning here
considers two separate random forest regression models, built on 390
time-series features, to rank 22 univariate forecasting methods and recommend
ensemble size. The forecasting ensemble is then formed from the best-ranked methods, and
their forecasts are pooled using either a simple or a weighted average (with a weight
corresponding to the reciprocal rank). The proposed approach was tested on 12561
micro-economic time series (expanded to 38633 across the various forecasting horizons)
of the M4 competition, where meta-learning outperformed the Theta and Comb benchmarks in
terms of relative forecasting errors for all data types and
horizons. Best overall results were achieved by weighted pooling with a
symmetric mean absolute percentage error of 9.21% versus 11.05% obtained using
the Theta method.
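The two-step procedure described above (rank the 22 candidate methods, recommend an ensemble size, then pool the top-ranked forecasts) can be illustrated with a minimal sketch. The class and variable names (TwoStepMetaEnsemble, ranker, sizer), the choice of per-method errors as ranking targets, and the hyperparameters are assumptions for illustration only, not the authors' released implementation; scikit-learn random forests stand in for the paper's models.

```python
# Minimal sketch of the two-step meta-learning ensemble described in the abstract.
# All names and training targets here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor


def smape(actual, forecast):
    """Symmetric mean absolute percentage error, in percent (M4-style)."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    return 100.0 * np.mean(2.0 * np.abs(forecast - actual) / (np.abs(actual) + np.abs(forecast)))


class TwoStepMetaEnsemble:
    def __init__(self):
        # Step 1: multi-output regressor scoring each of the 22 candidate methods.
        self.ranker = RandomForestRegressor(n_estimators=500, random_state=0)
        # Step 2: regressor recommending how many top-ranked methods to pool.
        self.sizer = RandomForestRegressor(n_estimators=500, random_state=0)

    def fit(self, meta_features, method_scores, best_sizes):
        # meta_features: (n_series, 390) features extracted from each training series
        # method_scores: (n_series, 22) per-method error used as the ranking target
        # best_sizes:    (n_series,) ensemble sizes that performed best in training
        self.ranker.fit(meta_features, method_scores)
        self.sizer.fit(meta_features, best_sizes)
        return self

    def forecast(self, features, candidate_forecasts, weighted=True):
        # features:            (390,) meta-features of a new series
        # candidate_forecasts: (22, horizon) forecasts produced by the candidate methods
        features = np.asarray(features).reshape(1, -1)
        scores = self.ranker.predict(features)[0]       # lower predicted error = better
        order = np.argsort(scores)                      # best-to-worst method ranking
        k = int(np.clip(round(self.sizer.predict(features)[0]), 1, len(order)))
        top = order[:k]
        if weighted:
            w = 1.0 / (np.arange(k) + 1.0)              # reciprocal-rank weights 1, 1/2, 1/3, ...
            w = w / w.sum()
        else:
            w = np.full(k, 1.0 / k)                     # simple average
        return w @ np.asarray(candidate_forecasts)[top] # pooled forecast over the horizon
```

With weighted=True, the reciprocal-rank weights correspond to the weighted-pooling variant that the abstract reports as giving the best overall results.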
Related papers
- GIFT-Eval: A Benchmark For General Time Series Forecasting Model Evaluation [90.53485251837235]
Time series foundation models excel in zero-shot forecasting, handling diverse tasks without explicit training.
GIFT-Eval is a pioneering benchmark aimed at promoting evaluation across diverse datasets.
GIFT-Eval encompasses 23 datasets over 144,000 time series and 177 million data points.
arXiv Detail & Related papers (2024-10-14T11:29:38Z)
- Infinite forecast combinations based on Dirichlet process [9.326879672480413]
This paper introduces a deep learning ensemble forecasting model based on the Dirichlet process.
It offers substantial improvements in prediction accuracy and stability compared to a single benchmark model.
arXiv Detail & Related papers (2023-11-21T06:41:41Z)
- Attention-Based Ensemble Pooling for Time Series Forecasting [55.2480439325792]
We propose a method for pooling that performs a weighted average over candidate model forecasts.
We test this method on two time-series forecasting problems: multi-step forecasting of the dynamics of the non-stationary Lorenz 63 equation, and one-step forecasting of the weekly incident deaths due to COVID-19.
arXiv Detail & Related papers (2023-10-24T22:59:56Z)
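As a point of contrast with the fixed reciprocal-rank weights used in the main paper, the entry above pools forecasts with learned attention weights. The following is a generic, illustrative sketch of softmax ("attention-style") weighting over candidate forecasts; the model embeddings and query are placeholder assumptions and this is not the architecture of the cited paper.

```python
# Generic softmax-attention pooling of candidate forecasts (illustrative only;
# in practice the embeddings and query would be learned end-to-end).
import numpy as np

def attention_pool(candidate_forecasts, model_embeddings, query):
    # candidate_forecasts: (n_models, horizon) forecasts from the candidate models
    # model_embeddings:    (n_models, d) representation of each candidate model
    # query:               (d,) context vector used to score the models
    scores = model_embeddings @ query            # relevance score per model
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights /= weights.sum()
    return weights @ candidate_forecasts         # attention-weighted average
```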
- When Rigidity Hurts: Soft Consistency Regularization for Probabilistic Hierarchical Time Series Forecasting [69.30930115236228]
Probabilistic hierarchical time-series forecasting is an important variant of time-series forecasting.
Most methods focus on point predictions and do not provide well-calibrated probabilistic forecast distributions.
We propose PROFHiT, a fully probabilistic hierarchical forecasting model that jointly models the forecast distribution of the entire hierarchy.
arXiv Detail & Related papers (2023-10-17T20:30:16Z)
- Beyond Ensemble Averages: Leveraging Climate Model Ensembles for Subseasonal Forecasting [10.083361616081874]
This study explores an application of machine learning (ML) models as post-processing tools for subseasonal forecasting.
Lagged numerical ensemble forecasts and observational data, including relative humidity, pressure at sea level, and geopotential height, are incorporated into various ML methods.
For regression, quantile regression, and tercile classification tasks, we consider using linear models, random forests, convolutional neural networks, and stacked models.
arXiv Detail & Related papers (2022-11-29T01:11:04Z)
- Meta-Forecasting by combining Global Deep Representations with Local Adaptation [12.747008878068314]
We introduce a novel forecasting method called Meta Global-Local Auto-Regression (Meta-GLAR).
It adapts to each time series by learning, in closed form, the mapping from the representations produced by a recurrent neural network (RNN) to one-step-ahead forecasts.
Our method is competitive with the state-of-the-art in out-of-sample forecasting accuracy reported in earlier work.
arXiv Detail & Related papers (2021-11-05T11:45:02Z)
- Cluster-and-Conquer: A Framework For Time-Series Forecasting [94.63501563413725]
We propose a three-stage framework for forecasting high-dimensional time-series data.
Our framework is highly general, allowing for any time-series forecasting and clustering method to be used in each step.
When instantiated with simple linear autoregressive models, we are able to achieve state-of-the-art results on several benchmark datasets.
arXiv Detail & Related papers (2021-10-26T20:41:19Z)
- CAMul: Calibrated and Accurate Multi-view Time-Series Forecasting [70.54920804222031]
We propose a general probabilistic multi-view forecasting framework CAMul.
It can learn representations and uncertainty from diverse data sources.
It integrates the knowledge and uncertainty from each data view in a dynamic context-specific manner.
We show that CAMul outperforms other state-of-the-art probabilistic forecasting models by over 25% in accuracy and calibration.
arXiv Detail & Related papers (2021-09-15T17:13:47Z)
- An Accurate and Fully-Automated Ensemble Model for Weekly Time Series Forecasting [9.617563440471928]
We propose a forecasting method in this domain, leveraging state-of-the-art forecasting techniques.
We consider different meta-learning architectures, algorithms, and base model pools.
Our proposed method consistently outperforms a set of benchmarks and state-of-the-art weekly forecasting models.
arXiv Detail & Related papers (2020-10-16T04:29:09Z)