Disentangling Structured Components: Towards Adaptive, Interpretable and
Scalable Time Series Forecasting
- URL: http://arxiv.org/abs/2305.13036v3
- Date: Thu, 15 Feb 2024 01:25:29 GMT
- Title: Disentangling Structured Components: Towards Adaptive, Interpretable and
Scalable Time Series Forecasting
- Authors: Jinliang Deng, Xiusi Chen, Renhe Jiang, Du Yin, Yi Yang, Xuan Song,
Ivor W. Tsang
- Abstract summary: We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
- Score: 52.47493322446537
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multivariate time-series (MTS) forecasting is a paramount and fundamental
problem in many real-world applications. The core issue in MTS forecasting is
how to effectively model complex spatial-temporal patterns. In this paper, we
develop an adaptive, interpretable and scalable forecasting framework, which
seeks to individually model each component of the spatial-temporal patterns. We
name this framework SCNN, as an acronym of Structured Component-based Neural
Network. SCNN works with a pre-defined generative process of MTS, which
arithmetically characterizes the latent structure of the spatial-temporal
patterns. In line with its reverse process, SCNN decouples MTS data into
structured and heterogeneous components and then respectively extrapolates the
evolution of these components, the dynamics of which are more traceable and
predictable than the original MTS. Extensive experiments are conducted to
demonstrate that SCNN can achieve superior performance over state-of-the-art
models on three real-world datasets. Additionally, we examine SCNN with
different configurations and perform in-depth analyses of the properties of
SCNN.
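The decouple-then-extrapolate idea in the abstract can be illustrated with a minimal sketch. This is a generic decomposition into a linear long-term trend, a repeating seasonal profile, and a residual, each extrapolated separately and recombined; it is not SCNN's actual generative process or architecture, and `period` is an assumed seasonal length.

```python
import numpy as np

def component_forecast(series, period, horizon):
    """Decompose a 1-D series into structured components and
    extrapolate each one separately (illustrative only)."""
    n = len(series)
    t = np.arange(n)
    # Long-term component: least-squares linear trend.
    slope, intercept = np.polyfit(t, series, 1)
    detrended = series - (intercept + slope * t)
    # Seasonal component: mean of each phase across all cycles.
    profile = np.array([detrended[i::period].mean() for i in range(period)])
    future = np.arange(n, n + horizon)
    trend_fc = intercept + slope * future    # extrapolated trend
    season_fc = profile[future % period]     # repeated seasonal profile
    # Residual is assumed mean-zero and contributes nothing to the forecast.
    return trend_fc + season_fc
```

The point of the sketch is that each component's dynamics (a slope, a fixed profile) are individually far more traceable than the raw series, which is the intuition the abstract appeals to.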
Related papers
- Equivariant Spatio-Temporal Attentive Graph Networks to Simulate Physical Dynamics [32.115887916401036]
We develop an equivariant version of Fourier-temporal GNNs to represent and simulate dynamics of physical systems.
We evaluate our model on three real datasets at the molecular, protein and macro levels.
arXiv Detail & Related papers (2024-05-21T15:33:21Z) - Batch-Ensemble Stochastic Neural Networks for Out-of-Distribution
Detection [55.028065567756066]
Out-of-distribution (OOD) detection has recently received much attention from the machine learning community due to its importance in deploying machine learning models in real-world applications.
In this paper we propose an uncertainty quantification approach by modelling the distribution of features.
We incorporate an efficient ensemble mechanism, namely batch-ensemble, to construct the batch-ensemble neural networks (BE-SNNs) and overcome the feature collapse problem.
We show that BE-SNNs yield superior performance on several OOD benchmarks, such as the Two-Moons dataset, the FashionMNIST vs MNIST dataset, FashionM
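The batch-ensemble mechanism mentioned above keeps one shared weight matrix and gives each ensemble member only a pair of rank-1 "fast" vectors, so an ensemble costs little more than a single network. The sketch below shows that idea for a single linear layer; it is a generic rank-1 batch-ensemble, not the paper's exact BE-SNN architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

class BatchEnsembleLayer:
    """Rank-1 batch-ensemble linear layer (illustrative sketch)."""
    def __init__(self, d_in, d_out, n_members):
        self.W = rng.normal(0, 0.1, (d_in, d_out))       # shared slow weight
        self.r = rng.normal(1, 0.1, (n_members, d_in))   # per-member input factors
        self.s = rng.normal(1, 0.1, (n_members, d_out))  # per-member output factors

    def forward(self, x, member):
        # Member-specific weight is W * outer(r, s), but it is applied
        # implicitly so only the rank-1 vectors are stored per member.
        return ((x * self.r[member]) @ self.W) * self.s[member]

def ensemble_predict(layer, x):
    # Average member outputs; their disagreement is an uncertainty proxy.
    outs = np.stack([layer.forward(x, m) for m in range(len(layer.r))])
    return outs.mean(axis=0), outs.std(axis=0)
```

Because `((x * r) @ W) * s` equals `x @ (W * outer(r, s))`, each member behaves like a distinct network while sharing almost all parameters.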
arXiv Detail & Related papers (2022-06-26T16:00:22Z) - On the Intrinsic Structures of Spiking Neural Networks [66.57589494713515]
Recent years have seen a surge of interest in SNNs owing to their remarkable potential for handling time-dependent and event-driven data.
There has been a dearth of comprehensive studies examining the impact of intrinsic structures within spiking computations.
This work delves into the intrinsic structures of SNNs, elucidating their influence on the networks' expressivity.
arXiv Detail & Related papers (2022-06-21T09:42:30Z) - Pre-training Enhanced Spatial-temporal Graph Neural Network for
Multivariate Time Series Forecasting [13.441945545904504]
We propose a novel framework, in which STGNN is Enhanced by a scalable time series Pre-training model (STEP)
Specifically, we design a pre-training model to efficiently learn temporal patterns from very long-term history time series.
Our framework is capable of significantly enhancing downstream STGNNs, and our pre-training model aptly captures temporal patterns.
arXiv Detail & Related papers (2022-06-18T04:24:36Z) - Space-Time Graph Neural Networks [104.55175325870195]
We introduce space-time graph neural network (ST-GNN) to jointly process the underlying space-time topology of time-varying network data.
Our analysis shows that small variations in the network topology and time evolution of a system do not significantly affect the performance of ST-GNNs.
arXiv Detail & Related papers (2021-10-06T16:08:44Z) - PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive
Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z) - A journey in ESN and LSTM visualisations on a language task [77.34726150561087]
We trained ESNs and LSTMs on a Cross-Situational Learning (CSL) task.
The results are of three kinds: performance comparison, internal dynamics analyses and visualization of latent space.
arXiv Detail & Related papers (2020-12-03T08:32:01Z) - MTHetGNN: A Heterogeneous Graph Embedding Framework for Multivariate
Time Series Forecasting [4.8274015390665195]
We propose a novel end-to-end deep learning model, termed Multivariate Time Series Forecasting via Heterogeneous Graph Neural Networks (MTHetGNN)
To characterize complex relations among variables, a relation embedding module is designed in MTHetGNN, where each variable is regarded as a graph node.
A temporal embedding module is introduced for time series feature extraction, involving convolutional neural network (CNN) filters with different perception scales.
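The two modules described above can be sketched very simply: treat each variable as a node and derive a relation graph from pairwise statistics, then summarize each variable at several temporal scales. This sketch uses a single correlation-thresholded graph and moving averages as stand-ins; MTHetGNN itself learns multiple heterogeneous relation graphs and uses trainable CNN filters, so the functions below are illustrative only.

```python
import numpy as np

def relation_embedding(mts, threshold=0.5):
    """Build one relation graph among variables (rows of `mts`)
    by thresholding absolute pairwise correlations."""
    corr = np.corrcoef(mts)
    adj = (np.abs(corr) >= threshold).astype(float)
    np.fill_diagonal(adj, 0.0)  # no self-loops
    return adj

def temporal_embedding(mts, widths=(2, 4, 8)):
    """Per-variable features from moving averages at several window
    widths, standing in for CNN filters of different perception scales."""
    feats = [np.stack([np.convolve(x, np.ones(w) / w, mode="valid")[-1:]
                       for x in mts]).ravel() for w in widths]
    return np.stack(feats, axis=1)  # shape: (n_vars, n_scales)
```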
arXiv Detail & Related papers (2020-08-19T18:21:22Z) - Error-feedback stochastic modeling strategy for time series forecasting
with convolutional neural networks [11.162185201961174]
We propose a novel Error-feedback Stochastic Modeling (ESM) strategy to construct a random Convolutional Neural Network (ESM-CNN) for the time series forecasting task.
The proposed ESM-CNN not only outperforms state-of-the-art random neural networks, but also exhibits stronger predictive power and lower computing overhead than trained state-of-the-art deep neural network models.
arXiv Detail & Related papers (2020-02-03T13:30:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.