Structural Knowledge Informed Continual Multivariate Time Series
Forecasting
- URL: http://arxiv.org/abs/2402.12722v1
- Date: Tue, 20 Feb 2024 05:11:20 GMT
- Title: Structural Knowledge Informed Continual Multivariate Time Series
Forecasting
- Authors: Zijie Pan, Yushan Jiang, Dongjin Song, Sahil Garg, Kashif Rasul,
Anderson Schneider, Yuriy Nevmyvaka
- Abstract summary: We propose a novel Structural Knowledge Informed Continual Learning (SKI-CL) framework to perform MTS forecasting within a continual learning paradigm.
Specifically, we develop a forecasting model based on graph structure learning, where a consistency regularization scheme is imposed between the learned variable dependencies and the structural knowledge.
We develop a representation-matching memory replay scheme that maximizes the temporal coverage of MTS data to efficiently preserve the underlying temporal dynamics and dependency structures of each regime.
- Score: 23.18105409644709
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent studies in multivariate time series (MTS) forecasting reveal that
explicitly modeling the hidden dependencies among different time series can
yield promising forecasting performance and reliable explanations. However,
modeling variable dependencies remains underexplored when MTS is continuously
accumulated under different regimes (stages). Due to the potential distribution
and dependency disparities, the underlying model may encounter the catastrophic
forgetting problem, i.e., it is challenging to memorize and infer different
types of variable dependencies across different regimes while maintaining
forecasting performance. To address this issue, we propose a novel Structural
Knowledge Informed Continual Learning (SKI-CL) framework to perform MTS
forecasting within a continual learning paradigm, which leverages structural
knowledge to steer the forecasting model toward identifying and adapting to
different regimes, and selects representative MTS samples from each regime for
memory replay. Specifically, we develop a forecasting model based on graph
structure learning, where a consistency regularization scheme is imposed
between the learned variable dependencies and the structural knowledge while
optimizing the forecasting objective over the MTS data. As such, MTS
representations learned in each regime are associated with distinct structural
knowledge, which helps the model memorize a variety of conceivable scenarios
and results in accurate forecasts in the continual learning context. Meanwhile,
we develop a representation-matching memory replay scheme that maximizes the
temporal coverage of MTS data to efficiently preserve the underlying temporal
dynamics and dependency structures of each regime. Thorough empirical studies
on synthetic and real-world benchmarks validate SKI-CL's efficacy and
advantages over the state-of-the-art for continual MTS forecasting tasks.
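The abstract describes two mechanisms: a consistency regularization between the learned dependency graph and the structural knowledge, and a representation-matching memory replay that maximizes temporal coverage of each regime. The sketch below is not taken from the paper; it is a minimal PyTorch-style illustration, under assumed names (GraphLearner, ski_cl_style_loss, select_replay_samples, lam), of how such a regularized objective and a coverage-oriented sample selection could be set up. The greedy farthest-point selection is a simple stand-in, not the authors' representation-matching algorithm.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GraphLearner(nn.Module):
    """Learns a soft adjacency matrix over N series from node embeddings."""
    def __init__(self, num_nodes: int, emb_dim: int = 16):
        super().__init__()
        self.emb = nn.Parameter(torch.randn(num_nodes, emb_dim))

    def forward(self) -> torch.Tensor:
        # Pairwise similarity mapped to [0, 1] as a dependency strength.
        return torch.sigmoid(self.emb @ self.emb.t())

def ski_cl_style_loss(y_pred, y_true, learned_adj, prior_adj, lam=0.1):
    """Forecasting loss plus a consistency term that pulls the learned
    dependencies toward the structural-knowledge prior (illustrative only)."""
    forecast_loss = F.mse_loss(y_pred, y_true)
    consistency = F.binary_cross_entropy(learned_adj, prior_adj.float())
    return forecast_loss + lam * consistency

def select_replay_samples(reps: torch.Tensor, budget: int) -> list:
    """Greedy farthest-point selection over window representations: a simple
    coverage heuristic standing in for representation-matching memory replay."""
    chosen = [0]
    min_dist = torch.cdist(reps, reps[:1]).squeeze(1)  # distance to chosen set
    for _ in range(budget - 1):
        idx = int(torch.argmax(min_dist))
        chosen.append(idx)
        min_dist = torch.minimum(min_dist, torch.cdist(reps, reps[idx:idx + 1]).squeeze(1))
    return chosen
```

In a continual setting, the selected window indices from each regime would populate a small replay buffer that is mixed into training when data from the next regime arrives.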
Related papers
- DisenTS: Disentangled Channel Evolving Pattern Modeling for Multivariate Time Series Forecasting [43.071713191702486]
DisenTS is a tailored framework for modeling disentangled channel evolving patterns in general time series forecasting.
We introduce a novel Forecaster Aware Gate (FAG) module that generates the routing signals adaptively according to both the forecasters' states and input series' characteristics.
arXiv Detail & Related papers (2024-10-30T12:46:14Z)
- Multi-Knowledge Fusion Network for Time Series Representation Learning [2.368662284133926]
We propose a hybrid architecture that combines prior knowledge with implicit knowledge of the relational structure within the MTS data.
The proposed architecture has shown promising results on multiple benchmark datasets and outperforms state-of-the-art forecasting methods by a significant margin.
arXiv Detail & Related papers (2024-08-22T14:18:16Z)
- Multi-Source Knowledge-Based Hybrid Neural Framework for Time Series Representation Learning [2.368662284133926]
The proposed hybrid architecture addresses limitations by combining both domain-specific knowledge and implicit knowledge of the relational structure underlying the MTS data.
The architecture shows promising results on multiple benchmark datasets, outperforming state-of-the-art forecasting methods.
arXiv Detail & Related papers (2024-08-22T13:58:55Z)
- Multi-Modality Spatio-Temporal Forecasting via Self-Supervised Learning [11.19088022423885]
We propose a novel MoST learning framework via Self-Supervised Learning, namely MoSSL.
Results on two real-world MoST datasets verify the superiority of our approach compared with the state-of-the-art baselines.
arXiv Detail & Related papers (2024-05-06T08:24:06Z)
- Causal Temporal Regime Structure Learning [49.77103348208835]
We introduce a new optimization-based method (linear) that concurrently learns the Directed Acyclic Graph (DAG) for each regime.
We conduct extensive experiments and show that our method consistently outperforms causal discovery models across various settings.
arXiv Detail & Related papers (2023-11-02T17:26:49Z)
- OpenSTL: A Comprehensive Benchmark of Spatio-Temporal Predictive Learning [67.07363529640784]
We propose OpenSTL to categorize prevalent approaches into recurrent-based and recurrent-free models.
We conduct standard evaluations on datasets across various domains, including synthetic moving object trajectories, human motion, driving scenes, traffic flow, and weather forecasting.
We find that recurrent-free models achieve a better balance between efficiency and performance than recurrent models.
arXiv Detail & Related papers (2023-06-20T03:02:14Z)
- Disentangling Structured Components: Towards Adaptive, Interpretable and Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z)
- The Capacity and Robustness Trade-off: Revisiting the Channel Independent Strategy for Multivariate Time Series Forecasting [50.48888534815361]
We show that models trained with the Channel Independent (CI) strategy outperform those trained with the Channel Dependent (CD) strategy.
Our results indicate that the CD approach has higher capacity but often lacks the robustness to accurately predict distributionally drifted time series.
We propose a modified CD method called Predict Residuals with Regularization (PRReg) that can surpass the CI strategy.
arXiv Detail & Related papers (2023-04-11T13:15:33Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Meta-learning framework with applications to zero-shot time-series forecasting [82.61728230984099]
This work provides positive evidence for zero-shot forecasting using a broad meta-learning framework.
Residual connections act as a meta-learning adaptation mechanism.
We show that it is viable to train a neural network on a source TS dataset and deploy it on a different target TS dataset without retraining.
arXiv Detail & Related papers (2020-02-07T16:39:43Z)
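As a rough companion to the meta-learning entry above, the following is a toy sketch (not the architecture from that paper) of a residual-stacked forecaster that is trained on a source dataset and then applied to a different target dataset without retraining; all class and variable names are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ResidualBlock(nn.Module):
    """Predicts a partial forecast and subtracts its own reconstruction of the
    input (backcast), so later blocks only model what is left unexplained."""
    def __init__(self, lookback: int, horizon: int, hidden: int = 64):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(lookback, hidden), nn.ReLU(),
                                  nn.Linear(hidden, hidden), nn.ReLU())
        self.backcast = nn.Linear(hidden, lookback)
        self.forecast = nn.Linear(hidden, horizon)

    def forward(self, x):
        h = self.body(x)
        return x - self.backcast(h), self.forecast(h)

class ResidualForecaster(nn.Module):
    def __init__(self, lookback: int, horizon: int, n_blocks: int = 3):
        super().__init__()
        self.blocks = nn.ModuleList([ResidualBlock(lookback, horizon) for _ in range(n_blocks)])

    def forward(self, x):
        total = 0
        for blk in self.blocks:
            x, f = blk(x)
            total = total + f
        return total

# Zero-shot use: fit on source-dataset windows, then forecast target-dataset
# windows with no further gradient updates (the weights are simply reused).
model = ResidualForecaster(lookback=48, horizon=12)
# ... train on source windows here ...
model.eval()
with torch.no_grad():
    target_forecast = model(torch.randn(8, 48))  # stand-in for target windows
```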