Graph Attention Recurrent Neural Networks for Correlated Time Series
Forecasting -- Full version
- URL: http://arxiv.org/abs/2103.10760v2
- Date: Mon, 22 Mar 2021 11:49:38 GMT
- Title: Graph Attention Recurrent Neural Networks for Correlated Time Series
Forecasting -- Full version
- Authors: Razvan-Gabriel Cirstea, Chenjuan Guo and Bin Yang
- Abstract summary: We consider a setting where multiple entities interact with each other over time and the time-varying statuses of the entities are represented as correlated time series.
To enable accurate forecasting on correlated time series, we propose graph attention recurrent neural networks.
Experiments on a large real-world speed time series data set suggest that the proposed method is effective and outperforms the state-of-the-art in most settings.
- Score: 16.22449727526222
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider a setting where multiple entities interact with each other over
time and the time-varying statuses of the entities are represented as multiple
correlated time series. For example, speed sensors are deployed in different
locations in a road network, where the speed of a specific location across time
is captured by the corresponding sensor as a time series, resulting in multiple
speed time series from different locations, which are often correlated. To
enable accurate forecasting on correlated time series, we propose graph
attention recurrent neural networks. First, we build a graph among different
entities by taking into account spatial proximity and employ a multi-head
attention mechanism to derive adaptive weight matrices for the graph to capture
the correlations among vertices (e.g., speeds at different locations) at
different timestamps. Second, we employ recurrent neural networks to capture
temporal dependency while using the adaptive weight matrices learned in the
first step to account for the correlations among time series. Experiments on a
large real-world speed time series data set suggest that the proposed method is
effective and outperforms the state-of-the-art in most settings. This
manuscript provides a full version of a workshop paper [1].
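The two-step design described in the abstract (spatial-proximity graph with multi-head attention producing timestamp-specific weight matrices, followed by a recurrent network over the propagated sensor states) can be sketched as follows. This is a minimal NumPy illustration under assumed shapes and random, untrained parameters, not the authors' actual implementation; the proximity mask `A`, head count, and GRU cell are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, F, D, T, HEADS = 4, 3, 8, 5, 2   # sensors, input dim, hidden dim, steps, heads

# Step 1 input: spatial-proximity graph, 1 where two sensors are close
# (self-loops included). Hypothetical 4-sensor road segment.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_weights(X, Wq, Wk):
    """Multi-head attention over graph edges: an adaptive (N, N) weight
    matrix for one timestamp, masked to the spatial-proximity graph."""
    out = []
    for h in range(HEADS):
        Q, K = X @ Wq[h], X @ Wk[h]
        scores = (Q @ K.T) / np.sqrt(Q.shape[1])
        scores = np.where(A > 0, scores, -1e9)   # only correlate nearby sensors
        out.append(softmax(scores, axis=1))
    return np.mean(out, axis=0)                  # rows sum to 1 over neighbors

def gru_step(x, h, p):
    """Step 2: a plain GRU cell capturing temporal dependency per sensor."""
    z = 1 / (1 + np.exp(-(x @ p["Wz"] + h @ p["Uz"])))
    r = 1 / (1 + np.exp(-(x @ p["Wr"] + h @ p["Ur"])))
    h_new = np.tanh(x @ p["Wh"] + (r * h) @ p["Uh"])
    return (1 - z) * h + z * h_new

# Random parameters for illustration only (an untrained model).
Wq = rng.normal(size=(HEADS, F, D)) * 0.1
Wk = rng.normal(size=(HEADS, F, D)) * 0.1
p = {k: rng.normal(size=s) * 0.1 for k, s in
     [("Wz", (F, D)), ("Uz", (D, D)), ("Wr", (F, D)),
      ("Ur", (D, D)), ("Wh", (F, D)), ("Uh", (D, D))]}

X_seq = rng.normal(size=(T, N, F))   # speed readings per sensor per step
h = np.zeros((N, D))
for t in range(T):
    W_t = attention_weights(X_seq[t], Wq, Wk)  # timestamp-specific correlations
    x_prop = W_t @ X_seq[t]                    # propagate neighbor information
    h = gru_step(x_prop, h, p)                 # update temporal state

print(h.shape)  # one hidden state per sensor: (4, 8)
```

The key point the sketch illustrates is that the graph weights are recomputed per timestamp from the current sensor readings, rather than being a fixed adjacency matrix, so the recurrent update always mixes in the currently correlated neighbors.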
Related papers
- TimeSiam: A Pre-Training Framework for Siamese Time-Series Modeling [67.02157180089573]
Time series pre-training has recently garnered wide attention for its potential to reduce labeling expenses and benefit various downstream tasks.
This paper proposes TimeSiam as a simple but effective self-supervised pre-training framework for Time series based on Siamese networks.
arXiv Detail & Related papers (2024-02-04T13:10:51Z) - TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting [20.03223916749058]
Time series forecasting lies at the core of important real-world applications in science and engineering.
We propose TimeGNN, a method that learns dynamic temporal graph representations.
TimeGNN achieves inference times 4 to 80 times faster than other state-of-the-art graph-based methods.
arXiv Detail & Related papers (2023-07-27T08:10:19Z) - Expressing Multivariate Time Series as Graphs with Time Series Attention
Transformer [14.172091921813065]
We propose the Time Series Attention Transformer (TSAT) for multivariate time series representation learning.
Using TSAT, we represent both temporal information and inter-dependencies of time series in terms of edge-enhanced dynamic graphs.
We show that TSAT clearly outperforms six state-of-the-art baseline methods in various forecasting horizons.
arXiv Detail & Related papers (2022-08-19T12:25:56Z) - HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z) - Learning the Evolutionary and Multi-scale Graph Structure for
Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure cooperated with the dilated convolution to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z) - Spatio-Temporal Joint Graph Convolutional Networks for Traffic
Forecasting [75.10017445699532]
Recent work has shifted its focus towards formulating traffic forecasting as a temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z) - GACAN: Graph Attention-Convolution-Attention Networks for Traffic
Forecasting Based on Multi-granularity Time Series [9.559635281384134]
We propose Graph Attention-Convolution-Attention Networks (GACAN) for traffic forecasting.
The model uses a novel Att-Conv-Att block which contains two graph attention layers and one spectral-based GCN layer sandwiched in between.
The main novelty of the model is the integration of time series of four different time granularities.
arXiv Detail & Related papers (2021-10-27T10:21:13Z) - Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains as climate, economics and health care.
Recent conceptual approach relies on time series mapping to complex networks.
Network analysis can be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)