Spatial-Temporal Identity: A Simple yet Effective Baseline for
Multivariate Time Series Forecasting
- URL: http://arxiv.org/abs/2208.05233v1
- Date: Wed, 10 Aug 2022 09:25:43 GMT
- Title: Spatial-Temporal Identity: A Simple yet Effective Baseline for
Multivariate Time Series Forecasting
- Authors: Zezhi Shao, Zhao Zhang, Fei Wang, Wei Wei, Yongjun Xu
- Abstract summary: We explore the critical factors of MTS forecasting and design a model that is as powerful as STGNNs, but more concise and efficient.
We identify the indistinguishability of samples in both spatial and temporal dimensions as a key bottleneck, and propose a simple yet effective baseline for MTS forecasting.
These results suggest that we can design efficient and effective models as long as they solve the indistinguishability of samples, without being limited to STGNNs.
- Score: 17.84296081495185
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Multivariate Time Series (MTS) forecasting plays a vital role in a wide range
of applications. Recently, Spatial-Temporal Graph Neural Networks (STGNNs) have
become increasingly popular MTS forecasting methods due to their
state-of-the-art performance. However, recent works are becoming more
sophisticated with limited performance improvements. This phenomenon motivates
us to explore the critical factors of MTS forecasting and design a model that
is as powerful as STGNNs, but more concise and efficient. In this paper, we
identify the indistinguishability of samples in both spatial and temporal
dimensions as a key bottleneck, and propose a simple yet effective baseline for
MTS forecasting by attaching Spatial and Temporal IDentity information (STID),
which achieves the best performance and efficiency simultaneously based on
simple Multi-Layer Perceptrons (MLPs). These results suggest that we can design
efficient and effective models as long as they solve the indistinguishability
of samples, without being limited to STGNNs.
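To make the idea concrete, below is a minimal sketch of an STID-style forecaster in PyTorch: learnable spatial (node) and temporal (time-of-day, day-of-week) identity embeddings are attached to an MLP-encoded history window before a linear forecasting head. The class name, layer sizes, and layer layout are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn

class STIDSketch(nn.Module):
    """Hedged sketch of the STID idea: an MLP forecaster with identity embeddings.
    Hyperparameters are assumed defaults, not the paper's exact configuration."""
    def __init__(self, num_nodes, in_len=12, out_len=12, dim=32,
                 steps_per_day=288, num_layers=3):
        super().__init__()
        # Identity embeddings make otherwise indistinguishable samples distinguishable.
        self.node_emb = nn.Embedding(num_nodes, dim)      # spatial identity
        self.tod_emb = nn.Embedding(steps_per_day, dim)   # time-of-day identity
        self.dow_emb = nn.Embedding(7, dim)               # day-of-week identity
        self.input_proj = nn.Linear(in_len, dim)          # embed the raw history window
        hidden = 4 * dim
        self.mlp = nn.Sequential(*[
            nn.Sequential(nn.Linear(4 * dim if i == 0 else hidden, hidden), nn.ReLU())
            for i in range(num_layers)
        ])
        self.head = nn.Linear(hidden, out_len)

    def forward(self, history, node_idx, tod_idx, dow_idx):
        # history: (batch, num_nodes, in_len); *_idx are integer index tensors
        h = self.input_proj(history)                      # (B, N, dim)
        n = self.node_emb(node_idx).unsqueeze(0).expand_as(h)
        t = self.tod_emb(tod_idx).unsqueeze(1).expand_as(h)
        d = self.dow_emb(dow_idx).unsqueeze(1).expand_as(h)
        z = torch.cat([h, n, t, d], dim=-1)               # attach the identity information
        return self.head(self.mlp(z))                     # (B, N, out_len)

# Example: 64 samples, 207 sensors, 12 past steps -> 12 future steps (sizes are illustrative)
model = STIDSketch(num_nodes=207)
out = model(torch.randn(64, 207, 12), torch.arange(207),
            torch.randint(0, 288, (64,)), torch.randint(0, 7, (64,)))
print(out.shape)  # torch.Size([64, 207, 12])
```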
Related papers
- TCGPN: Temporal-Correlation Graph Pre-trained Network for Stock Forecasting [1.864621482724548]
We propose a novel approach called the Temporal-Correlation Graph Pre-trained Network (TCGPN) to address these limitations.
TCGPN utilizes a temporal-correlation fusion encoder to obtain a mixed representation, together with a pre-training method built on carefully designed temporal and correlation pre-training tasks.
Experiments are conducted on the real stock-market datasets CSI300 and CSI500, which exhibit minimal periodicity.
arXiv Detail & Related papers (2024-07-26T05:27:26Z) - Not All Attention is Needed: Parameter and Computation Efficient Transfer Learning for Multi-modal Large Language Models [73.48675708831328]
We propose a novel parameter- and computation-efficient tuning method for Multi-modal Large Language Models (MLLMs).
The Efficient Attention Skipping (EAS) method evaluates attention redundancy and skips the less important MHAs to speed up inference.
The experiments show that EAS not only retains high performance and parameter efficiency, but also greatly accelerates inference.
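The snippet below is a hedged illustration of the skipping idea described above, under assumed simplifications: a toy transformer block whose attention sub-layer can be bypassed, plus a cosine-similarity proxy for attention redundancy. It is not the EAS algorithm itself.

```python
import torch
import torch.nn as nn

class SkippableBlock(nn.Module):
    """Toy transformer block whose MHA sub-layer can be bypassed at inference."""
    def __init__(self, dim=64, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.norm1, self.norm2 = nn.LayerNorm(dim), nn.LayerNorm(dim)
        self.ffn = nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))

    def forward(self, x, skip_attn=False):
        if not skip_attn:                      # a redundant MHA is simply not evaluated
            a, _ = self.attn(x, x, x, need_weights=False)
            x = self.norm1(x + a)
        return self.norm2(x + self.ffn(x))

def attn_redundancy(block, x):
    """Illustrative proxy: similarity of the block output with and without attention.
    A high score suggests the MHA is redundant and a candidate for skipping."""
    with torch.no_grad():
        full, skipped = block(x), block(x, skip_attn=True)
        return nn.functional.cosine_similarity(full.flatten(1), skipped.flatten(1)).mean().item()
```

Blocks whose proxy score exceeds a chosen threshold would then be run with `skip_attn=True` at inference time.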
arXiv Detail & Related papers (2024-03-22T14:20:34Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
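As a rough, hedged sketch of "matrix multiplications in Fourier space" (not the paper's Fourier Graph Operator): transform node features with an FFT along the node axis, apply learnable complex weights per frequency mode, and transform back. The mode-wise (diagonal) weighting and the 1-D FFT over nodes are simplifying assumptions.

```python
import torch
import torch.nn as nn

class FourierGraphOpSketch(nn.Module):
    """Toy graph operator in Fourier space: rFFT over nodes, complex weighting, inverse rFFT."""
    def __init__(self, num_nodes, dim):
        super().__init__()
        # One learnable complex weight per Fourier mode and feature channel (an assumption).
        self.weight = nn.Parameter(0.02 * torch.randn(num_nodes // 2 + 1, dim, dtype=torch.cfloat))

    def forward(self, x):
        # x: (batch, num_nodes, dim) real-valued node features
        xf = torch.fft.rfft(x, dim=1)   # mix information across nodes in the frequency domain
        xf = xf * self.weight           # 'matrix multiplication' reduced to mode-wise scaling
        return torch.fft.irfft(xf, n=x.size(1), dim=1)
```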
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - ST-MLP: A Cascaded Spatio-Temporal Linear Framework with
Channel-Independence Strategy for Traffic Forecasting [47.74479442786052]
Current research on Spatio-Temporal Graph Neural Networks (STGNNs) often prioritizes complex designs, leading to computational burdens with only minor enhancements in accuracy.
We propose ST-MLP, a concise cascaded spatio-temporal model solely based on Multi-Layer Perceptron (MLP) modules and linear layers.
Empirical results demonstrate that ST-MLP outperforms state-of-the-art STGNNs and other models in terms of accuracy and computational efficiency.
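The channel-independence strategy named in the title can be illustrated with a short, hedged sketch: one shared MLP maps each series' own history to its forecast, with no cross-series mixing. This is a generic illustration of the strategy, not the ST-MLP architecture.

```python
import torch
import torch.nn as nn

class ChannelIndependentMLP(nn.Module):
    """Shared MLP applied to every channel independently (no inter-series mixing)."""
    def __init__(self, in_len=12, out_len=12, hidden=128):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_len, hidden), nn.ReLU(), nn.Linear(hidden, out_len))

    def forward(self, x):
        # x: (batch, num_channels, in_len); the linear layers act on the time axis only,
        # so channels never interact -- the channel-independence strategy.
        return self.net(x)
```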
arXiv Detail & Related papers (2023-08-14T23:34:59Z) - Contextualizing MLP-Mixers Spatiotemporally for Urban Data Forecast at Scale [54.15522908057831]
We propose an adapted version of the computationally efficient MLP-Mixer for STTD forecast at scale.
Our results surprisingly show that this simple-yet-effective solution can rival SOTA baselines when tested on several traffic benchmarks.
Our findings contribute to the exploration of simple-yet-effective models for real-world STTD forecasting.
arXiv Detail & Related papers (2023-07-04T05:19:19Z) - Disentangling Structured Components: Towards Adaptive, Interpretable and
Scalable Time Series Forecasting [52.47493322446537]
We develop an adaptive, interpretable and scalable forecasting framework, which seeks to individually model each component of the spatial-temporal patterns.
SCNN works with a pre-defined generative process of MTS, which arithmetically characterizes the latent structure of the spatial-temporal patterns.
Extensive experiments are conducted to demonstrate that SCNN can achieve superior performance over state-of-the-art models on three real-world datasets.
arXiv Detail & Related papers (2023-05-22T13:39:44Z) - Less Is More: Fast Multivariate Time Series Forecasting with Light
Sampling-oriented MLP Structures [18.592350352298553]
We introduce LightTS, a light deep learning architecture merely based on simple MLP-based structures.
Compared with existing state-of-the-art methods, LightTS demonstrates better performance on five of the benchmark datasets and comparable performance on the rest.
LightTS is robust and has a much smaller variance in forecasting accuracy than previous SOTA methods in long sequence forecasting tasks.
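A hedged sketch of a sampling-oriented MLP structure in the spirit of the title: the input window is interval-sampled into interleaved sub-sequences, each encoded by a shared MLP and merged by a linear head. The specific sampling scheme and sizes are assumptions, not the LightTS design.

```python
import torch
import torch.nn as nn

class SamplingMLPSketch(nn.Module):
    """Interval-sample the history into `stride` interleaved sub-sequences, encode each
    with a shared MLP, then project the concatenation to the forecast horizon."""
    def __init__(self, in_len=96, out_len=24, stride=4, hidden=64):
        super().__init__()
        assert in_len % stride == 0
        self.stride = stride
        self.encode = nn.Sequential(nn.Linear(in_len // stride, hidden), nn.ReLU())
        self.head = nn.Linear(stride * hidden, out_len)

    def forward(self, x):
        # x: (batch, num_series, in_len)
        b, n, t = x.shape
        # (batch, num_series, stride, in_len // stride): sub-sequence j holds points j, j+stride, ...
        sub = x.reshape(b, n, t // self.stride, self.stride).transpose(-1, -2)
        enc = self.encode(sub)                      # shared MLP per sub-sequence
        return self.head(enc.reshape(b, n, -1))     # (batch, num_series, out_len)
```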
arXiv Detail & Related papers (2022-07-04T04:03:00Z) - Pre-training Enhanced Spatial-temporal Graph Neural Network for
Multivariate Time Series Forecasting [13.441945545904504]
We propose a novel framework, in which STGNN is Enhanced by a scalable time series Pre-training model (STEP).
Specifically, we design a pre-training model to efficiently learn temporal patterns from very long-term history time series.
Our framework is capable of significantly enhancing downstream STGNNs, and our pre-training model aptly captures temporal patterns.
arXiv Detail & Related papers (2022-06-18T04:24:36Z) - A Generative Learning Approach for Spatio-temporal Modeling in Connected
Vehicular Network [55.852401381113786]
This paper proposes LaMI (Latency Model Inpainting), a novel framework that generates a comprehensive spatio-temporal quality map of wireless access latency for connected vehicles.
LaMI adopts ideas from image inpainting and synthesis and can reconstruct missing latency samples by a two-step procedure.
In particular, it first discovers the spatial correlation between samples collected in various regions using a patching-based approach and then feeds the original and highly correlated samples into a Variational Autoencoder (VAE).
arXiv Detail & Related papers (2020-03-16T03:43:59Z)
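The second step described above, feeding correlated samples into a VAE to reconstruct missing latency values, can be illustrated with a minimal hedged sketch; the architecture, sizes, and loss weighting below are assumptions, not LaMI's implementation.

```python
import torch
import torch.nn as nn

class TinyVAE(nn.Module):
    """Minimal VAE, used here only to illustrate reconstructing missing samples."""
    def __init__(self, dim=64, latent=8):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(dim, 32), nn.ReLU())
        self.mu, self.logvar = nn.Linear(32, latent), nn.Linear(32, latent)
        self.dec = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(), nn.Linear(32, dim))

    def forward(self, x_masked):
        h = self.enc(x_masked)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.dec(z), mu, logvar

def vae_loss(recon, target, mu, logvar, observed_mask, kl_weight=1e-3):
    # Reconstruct only where ground truth was observed; the KL term regularizes the latent.
    rec = ((recon - target) ** 2 * observed_mask).sum() / observed_mask.sum()
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return rec + kl_weight * kl
```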
This list is automatically generated from the titles and abstracts of the papers on this site.