Two-stage hybrid models for enhancing forecasting accuracy on heterogeneous time series
- URL: http://arxiv.org/abs/2502.08600v1
- Date: Wed, 12 Feb 2025 17:39:02 GMT
- Title: Two-stage hybrid models for enhancing forecasting accuracy on heterogeneous time series
- Authors: Junru Ren, Shaomin Wu
- Abstract summary: Compared to local models built in a series-by-series manner, global models leverage relevant information across time series.
The advantages of global models may not always be realized when dealing with heterogeneous data.
Determining whether the time series data is homogeneous or heterogeneous can be ambiguous in practice.
This paper proposes two-stage hybrid models, which include a second stage to identify and model heterogeneous patterns.
- Score: 0.6506991840948217
- License:
- Abstract: Compared to local models built in a series-by-series manner, global models leverage relevant information across time series, resulting in improved forecasting performance and generalization capacity. Constructing global models on a set of time series is becoming mainstream in the field of time series forecasting. However, the advantages of global models may not always be realized when dealing with heterogeneous data. While global models can adapt to heterogeneous datasets by increasing model complexity, a model cannot be infinitely complex given the finite sample size, which poses challenges for the application of global models. Additionally, determining whether time series data is homogeneous or heterogeneous can be ambiguous in practice. To address these research gaps, this paper argues that the heterogeneity of the data should be defined with respect to the global model used: for each series, the portion not modelled by the global model represents its heterogeneity. It further proposes two-stage hybrid models, which include a second stage to identify and model heterogeneous patterns. In this second stage, we can estimate either all local models or sub-global models across different domains divided based on heterogeneity. Experiments on four open datasets reveal that the proposed methods significantly outperform five existing models, indicating that they help to fully unleash the potential of global models on heterogeneous datasets.
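The abstract's recipe is concrete enough to sketch. Below is a minimal illustration of the two-stage idea, not the paper's exact method: stage one fits a single pooled (global) autoregression; stage two treats each series' residuals as its heterogeneity and fits corrective sub-global models on residual groups. The ridge regressors, the lag length, and the two-feature residual signature used for grouping are all assumptions made for illustration.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.cluster import KMeans

def make_lagged(series, n_lags):
    """Turn one series into (lag-window, next-value) training pairs."""
    s = np.asarray(series, dtype=float)
    X = np.array([s[t - n_lags:t] for t in range(n_lags, len(s))])
    return X, s[n_lags:]

def fit_two_stage(series_list, n_lags=4, n_groups=3):
    # Stage 1: one global model pooled across every series.
    pairs = [make_lagged(s, n_lags) for s in series_list]
    X_all = np.vstack([X for X, _ in pairs])
    y_all = np.concatenate([y for _, y in pairs])
    global_model = Ridge().fit(X_all, y_all)

    # Heterogeneity, per the paper's definition: whatever the global
    # model fails to capture, i.e. each series' residuals.
    residuals = [y - global_model.predict(X) for X, y in pairs]

    # Stage 2 (sub-global variant): group series by a crude residual
    # signature and fit one corrective model per group. The local-model
    # variant would instead fit one corrective model per series.
    signatures = np.array([[r.mean(), r.std()] for r in residuals])
    groups = KMeans(n_clusters=n_groups, n_init=10).fit_predict(signatures)
    sub_models = {}
    for g in range(n_groups):
        members = [i for i in range(len(pairs)) if groups[i] == g]
        Xg = np.vstack([pairs[i][0] for i in members])
        rg = np.concatenate([residuals[i] for i in members])
        sub_models[g] = Ridge().fit(Xg, rg)
    return global_model, sub_models, groups

def forecast_one_step(series, i, global_model, sub_models, groups, n_lags=4):
    """Final forecast = global prediction + group-specific correction."""
    x = np.asarray(series[-n_lags:], dtype=float).reshape(1, -1)
    return global_model.predict(x)[0] + sub_models[groups[i]].predict(x)[0]
```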
Related papers
- Learning Divergence Fields for Shift-Robust Graph Representations [73.11818515795761]
In this work, we propose a geometric diffusion model with learnable divergence fields for the challenging setting of interdependent data.
We derive a new learning objective through causal inference, which can guide the model to learn generalizable patterns of interdependence that are insensitive across domains.
arXiv Detail & Related papers (2024-06-07T14:29:21Z)
- Task Groupings Regularization: Data-Free Meta-Learning with Heterogeneous Pre-trained Models [83.02797560769285]
Data-Free Meta-Learning (DFML) aims to derive knowledge from a collection of pre-trained models without accessing their original data.
Current methods often overlook the heterogeneity among pre-trained models, which leads to performance degradation due to task conflicts.
arXiv Detail & Related papers (2024-05-26T13:11:55Z)
- Context Neural Networks: A Scalable Multivariate Model for Time Series Forecasting [5.5711773076846365]
Real-world time series often exhibit complex interdependencies that cannot be captured in isolation.
This paper introduces the Context Neural Network, an efficient linear complexity approach for augmenting time series models with relevant contextual insights.
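The summary does not spell out the mechanism, so the following is only a toy illustration of linear-complexity context augmentation, not the Context Neural Network itself: compute one pooled context summary across all series in a single O(N) pass and concatenate it to every series' input features.

```python
import numpy as np

def add_shared_context(X_per_series):
    """Append one shared, dataset-level context vector to each sample.

    The context here is simply the mean feature row across all series,
    computed once, so the augmentation is linear in the number of
    series. A hypothetical stand-in for a learned context module.
    """
    context = np.vstack(X_per_series).mean(axis=0)
    return [np.hstack([X, np.tile(context, (len(X), 1))]) for X in X_per_series]
```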
arXiv Detail & Related papers (2024-05-12T00:21:57Z)
- Dataless Knowledge Fusion by Merging Weights of Language Models [51.8162883997512]
Fine-tuning pre-trained language models has become the prevalent paradigm for building downstream NLP models.
This creates a barrier to fusing knowledge across individual models to yield a better single model.
We propose a dataless knowledge fusion method that merges models in their parameter space.
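Parameter-space merging is easy to illustrate. The sketch below uses plain uniform averaging of same-architecture checkpoints, which is the simplest such merge; the paper's actual merging rule may weight parameters differently.

```python
import torch

def merge_state_dicts(state_dicts, weights=None):
    """Merge same-architecture checkpoints parameter-by-parameter.

    With weights=None this is uniform averaging; float parameters
    are assumed throughout. No training data is touched at any point.
    """
    if weights is None:
        weights = [1.0 / len(state_dicts)] * len(state_dicts)
    return {
        name: sum(w * sd[name] for w, sd in zip(weights, state_dicts))
        for name in state_dicts[0]
    }

# Usage: merged = merge_state_dicts([model_a.state_dict(), model_b.state_dict()])
#        model_c.load_state_dict(merged)
```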
arXiv Detail & Related papers (2022-12-19T20:46:43Z)
- Towards Understanding and Mitigating Dimensional Collapse in Heterogeneous Federated Learning [112.69497636932955]
Federated learning aims to train models across different clients without the sharing of data for privacy considerations.
We study how data heterogeneity affects the representations of the globally aggregated models.
We propose FedDecorr, a novel method that can effectively mitigate dimensional collapse in federated learning.
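The summary names the goal (mitigating dimensional collapse) but not the loss. One generic way to pursue it, shown below purely as an illustration and not as FedDecorr's exact formulation, is to penalize off-diagonal correlations between representation dimensions during each client's local training.

```python
import torch

def decorrelation_penalty(z, eps=1e-5):
    """Mean squared off-diagonal entry of the batch correlation matrix.

    z: (batch, dim) representations. Driving this toward zero spreads
    variance across dimensions, counteracting dimensional collapse.
    Illustrative only; not claimed to be FedDecorr's exact loss.
    """
    z = (z - z.mean(dim=0)) / (z.std(dim=0) + eps)  # standardize each dimension
    corr = (z.T @ z) / z.shape[0]                   # (dim, dim) correlation estimate
    off_diag = corr - torch.diag(torch.diag(corr))
    return (off_diag ** 2).mean()

# On each client: loss = task_loss + alpha * decorrelation_penalty(features)
```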
arXiv Detail & Related papers (2022-10-01T09:04:17Z)
- Ensembles of Localised Models for Time Series Forecasting [7.199741890914579]
We study how ensembling techniques can be used with generic GFMs and univariate models to solve this issue.
Our work systematises and compares relevant current approaches, namely clustering series and training separate submodels per cluster.
We propose a new methodology of clustered ensembles where we train multiple GFMs on different clusters of series.
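A minimal sketch of the clustered-ensemble recipe the summary describes: partition the series, train one pooled model per cluster, and (as an assumed ensembling choice) average each series' forecast over several random restarts of the clustering. The feature choice and models are placeholders for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

def lagged(s, p):
    s = np.asarray(s, dtype=float)
    return np.array([s[t - p:t] for t in range(p, len(s))]), s[p:]

def clustered_gfm_forecast(series_list, p=4, k=3, n_clusterings=5, seed=0):
    """One-step forecasts ensembled over several clusterings of the series."""
    rng = np.random.default_rng(seed)
    feats = np.array([[np.mean(s), np.std(s)] for s in series_list])  # crude features
    preds = np.zeros(len(series_list))
    for _ in range(n_clusterings):
        labels = KMeans(n_clusters=k, n_init=5,
                        random_state=int(rng.integers(1_000_000))).fit_predict(feats)
        for c in range(k):
            members = [i for i, l in enumerate(labels) if l == c]
            X = np.vstack([lagged(series_list[i], p)[0] for i in members])
            y = np.concatenate([lagged(series_list[i], p)[1] for i in members])
            sub_gfm = LinearRegression().fit(X, y)  # one GFM per cluster
            for i in members:
                x = np.asarray(series_list[i][-p:], dtype=float).reshape(1, -1)
                preds[i] += sub_gfm.predict(x)[0] / n_clusterings
    return preds
```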
arXiv Detail & Related papers (2020-12-30T06:33:51Z)
- Global Models for Time Series Forecasting: A Simulation Study [2.580765958706854]
We simulate time series from simple data generating processes (DGP), such as Auto Regressive (AR) and Seasonal AR, to complex DGPs, such as Chaotic Logistic Map, Self-Exciting Threshold Auto-Regressive, and Mackey-Glass equations.
The lengths and the number of series in the dataset are varied in different scenarios.
We perform experiments on these datasets using global forecasting models including Recurrent Neural Networks (RNN), Feed-Forward Neural Networks, Pooled Regression (PR) models, and Light Gradient Boosting Models (LGBM).
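Two of the generators named above are simple enough to reproduce directly; a minimal sketch with arbitrary parameter choices:

```python
import numpy as np

def simulate_ar1(n, phi=0.8, sigma=1.0, seed=0):
    """AR(1): y_t = phi * y_{t-1} + eps_t with Gaussian noise."""
    rng = np.random.default_rng(seed)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = phi * y[t - 1] + rng.normal(0.0, sigma)
    return y

def simulate_logistic_map(n, r=3.9, x0=0.5):
    """Chaotic logistic map: x_{t+1} = r * x_t * (1 - x_t), chaotic near r = 4."""
    x = np.empty(n)
    x[0] = x0
    for t in range(1, n):
        x[t] = r * x[t - 1] * (1 - x[t - 1])
    return x

# A toy heterogeneous dataset: mix DGPs and vary lengths and parameters.
dataset = ([simulate_ar1(200, phi=0.5 + 0.04 * i, seed=i) for i in range(10)]
           + [simulate_logistic_map(150 + 10 * i, x0=0.1 + 0.05 * i) for i in range(10)])
```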
arXiv Detail & Related papers (2020-12-23T04:45:52Z)
- Unsupervised Learning of Global Factors in Deep Generative Models [6.362733059568703]
We present a novel deep generative model based on non i.i.d. variational autoencoders.
We show that the model performs domain alignment to find correlations and interpolate between different databases.
We also study the ability of the global space to discriminate between groups of observations with non-trivial underlying structures.
arXiv Detail & Related papers (2020-12-15T11:55:31Z)
- Improving the Reconstruction of Disentangled Representation Learners via Multi-Stage Modeling [54.94763543386523]
Current autoencoder-based disentangled representation learning methods achieve disentanglement by penalizing the (aggregate) posterior to encourage statistical independence of the latent factors.
We present a novel multi-stage modeling approach where the disentangled factors are first learned using a penalty-based disentangled representation learning method.
Then, the low-quality reconstruction is improved with another deep generative model that is trained to model the missing correlated latent variables.
arXiv Detail & Related papers (2020-10-25T18:51:15Z)
- Robust Finite Mixture Regression for Heterogeneous Targets [70.19798470463378]
We propose an FMR model that finds sample clusters and jointly models multiple incomplete mixed-type targets.
We provide non-asymptotic oracle performance bounds for our model under a high-dimensional learning framework.
The results show that our model can achieve state-of-the-art performance.
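Finite mixture regression itself is a classical model; below is a minimal EM sketch for a K-component mixture of linear regressions with a single continuous target, omitting the paper's robustness, mixed-type, and incomplete-target machinery.

```python
import numpy as np
from scipy.stats import norm

def fmr_em(X, y, K=2, n_iter=50, seed=0):
    """EM for a K-component mixture of linear regressions (minimal sketch)."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    beta = rng.normal(size=(K, d))   # per-component regression coefficients
    sigma = np.ones(K)               # per-component noise scales
    pi = np.full(K, 1.0 / K)         # mixing proportions
    for _ in range(n_iter):
        # E-step: responsibility of each component for each sample.
        dens = np.stack([pi[k] * norm.pdf(y, X @ beta[k], sigma[k])
                         for k in range(K)], axis=1)
        resp = dens / (dens.sum(axis=1, keepdims=True) + 1e-12)
        # M-step: weighted least squares and weighted noise update per component.
        for k in range(K):
            w = resp[:, k]
            beta[k] = np.linalg.solve(X.T @ (w[:, None] * X), X.T @ (w * y))
            sigma[k] = np.sqrt(np.sum(w * (y - X @ beta[k]) ** 2) / w.sum())
        pi = resp.mean(axis=0)
    return pi, beta, sigma
```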
arXiv Detail & Related papers (2020-10-12T03:27:07Z)
- VAEM: a Deep Generative Model for Heterogeneous Mixed Type Data [16.00692074660383]
VAEM is a deep generative model that is trained in a two-stage manner.
We show that VAEM broadens the range of real-world applications where deep generative models can be successfully deployed.
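The two stages, per the paper's design: first fit an independent marginal VAE to each variable so heterogeneous types get comparable latent codes, then fit a second VAE over the concatenated codes to capture dependencies between variables. The minimal sketch below assumes continuous variables only, so it drops the per-type likelihoods that motivate VAEM.

```python
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    """A small Gaussian VAE; reused for both stages of the sketch."""
    def __init__(self, in_dim, z_dim=2, hidden=16):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                 nn.Linear(hidden, in_dim))

    def forward(self, x):
        mu, logvar = self.enc(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        recon = self.dec(z)
        kl = 0.5 * (mu ** 2 + logvar.exp() - 1 - logvar).sum(-1).mean()
        return recon, z.detach(), kl

# Stage 1: one marginal VAE per column of the data matrix x with shape (n, D).
# Stage 2: a dependency VAE over the concatenated per-variable codes:
#   codes = torch.cat([vae_d(x[:, d:d+1])[1]
#                      for d, vae_d in enumerate(marginals)], dim=-1)
#   recon, _, kl = dependency_vae(codes)
```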
arXiv Detail & Related papers (2020-06-21T23:47:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.