Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series
- URL: http://arxiv.org/abs/2102.07289v1
- Date: Mon, 15 Feb 2021 00:57:28 GMT
- Title: Radflow: A Recurrent, Aggregated, and Decomposable Model for Networks of Time Series
- Authors: Alasdair Tran, Alexander Mathews, Cheng Soon Ong, Lexing Xie
- Abstract summary: Radflow is a novel model for networks of time series that influence each other.
It embodies three key ideas: a recurrent neural network to obtain node embeddings that depend on time, the aggregation of the flow of influence from neighboring nodes with multi-head attention, and the multi-layer decomposition of time series.
We show that Radflow can learn different trends and seasonal patterns, that it is robust to missing nodes and edges, and that correlated temporal patterns among network neighbors reflect influence strength.
- Score: 77.47313102926017
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a new model for networks of time series that influence each other.
Graph structures among time series are found in diverse domains, such as web
traffic influenced by hyperlinks, product sales influenced by recommendation,
or urban transport volume influenced by road networks and weather. There has
been recent progress in graph modeling and in time series forecasting,
respectively, but an expressive and scalable approach for a network of series
does not yet exist. We introduce Radflow, a novel model that embodies three key
ideas: a recurrent neural network to obtain node embeddings that depend on
time, the aggregation of the flow of influence from neighboring nodes with
multi-head attention, and the multi-layer decomposition of time series. Radflow
naturally takes into account dynamic networks where nodes and edges change over
time, and it can be used for prediction and data imputation tasks. On
real-world datasets ranging from a few hundred to a few hundred thousand nodes,
we observe that Radflow variants are the best performing model across a wide
range of settings. The recurrent component in Radflow also outperforms N-BEATS,
the state-of-the-art time series model. We show that Radflow can learn
different trends and seasonal patterns, that it is robust to missing nodes and
edges, and that correlated temporal patterns among network neighbors reflect
influence strength. We curate WikiTraffic, the largest dynamic network of time
series with 366K nodes and 22M time-dependent links spanning five years. This
dataset provides an open benchmark for developing models in this area, with
applications that include optimizing resources for the web. More broadly,
Radflow has the potential to improve forecasts in correlated time series
networks such as the stock market, and impute missing measurements in
geographically dispersed networks of natural phenomena.
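To make the first two ideas concrete, here is a minimal sketch in PyTorch of how an RNN can produce time-dependent node embeddings and how multi-head attention can aggregate the flow of influence from network neighbors. This is our own illustration rather than the authors' released code: the layer sizes are arbitrary, and the multi-layer decomposition component is omitted.

    import torch
    import torch.nn as nn

    class RadflowSketch(nn.Module):
        def __init__(self, hidden=64, n_heads=4):
            super().__init__()
            # Recurrent component: one embedding per node per time step.
            self.rnn = nn.LSTM(input_size=1, hidden_size=hidden, batch_first=True)
            # Aggregation component: each node attends to its neighbors' embeddings.
            self.attn = nn.MultiheadAttention(hidden, n_heads, batch_first=True)
            self.out = nn.Linear(2 * hidden, 1)

        def forward(self, series, neighbor_series):
            # series: (n_nodes, seq_len, 1)
            # neighbor_series: (n_nodes, n_neighbors, seq_len, 1)
            n_nodes, n_nbrs, seq_len, _ = neighbor_series.shape
            h, _ = self.rnn(series)                          # (n_nodes, seq_len, hidden)
            nbr_h, _ = self.rnn(neighbor_series.reshape(n_nodes * n_nbrs, seq_len, 1))
            nbr_h = nbr_h[:, -1].reshape(n_nodes, n_nbrs, -1)  # final neighbor embeddings
            # Query with each node's own final embedding; keys/values are its neighbors.
            agg, _ = self.attn(h[:, -1:], nbr_h, nbr_h)      # (n_nodes, 1, hidden)
            return self.out(torch.cat([h[:, -1], agg[:, 0]], dim=-1))

    # Toy usage: 5 nodes, 3 neighbors each, 28 historical steps, 1-step forecast.
    model = RadflowSketch()
    pred = model(torch.randn(5, 28, 1), torch.randn(5, 3, 28, 1))
    print(pred.shape)  # torch.Size([5, 1])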
Related papers
- TimeGNN: Temporal Dynamic Graph Learning for Time Series Forecasting [20.03223916749058]
Time series forecasting lies at the core of important real-world applications in science and engineering.
We propose TimeGNN, a method that learns dynamic temporal graph representations.
TimeGNN achieves inference times 4 to 80 times faster than other state-of-the-art graph-based methods.
arXiv Detail & Related papers (2023-07-27T08:10:19Z)
- Temporal Aggregation and Propagation Graph Neural Networks for Dynamic Representation [67.26422477327179]
Temporal graphs exhibit dynamic interactions between nodes over continuous time.
We propose a novel method of temporal graph convolution with the whole neighborhood.
Our proposed TAP-GNN outperforms existing temporal graph methods by a large margin in terms of both predictive performance and online inference latency.
arXiv Detail & Related papers (2023-04-15T08:17:18Z)
- Scalable Spatiotemporal Graph Neural Networks [14.415967477487692]
Graph neural networks (GNNs) are often the core component of the forecasting architecture.
In most spatiotemporal GNNs, the computational complexity scales up to quadratically with the sequence length times the number of links in the graph.
We propose a scalable architecture that exploits an efficient encoding of both temporal and spatial dynamics.
arXiv Detail & Related papers (2022-09-14T09:47:38Z)
- HyperTime: Implicit Neural Representation for Time Series [131.57172578210256]
Implicit neural representations (INRs) have recently emerged as a powerful tool that provides an accurate and resolution-independent encoding of data.
In this paper, we analyze the representation of time series using INRs, comparing different activation functions in terms of reconstruction accuracy and training convergence speed.
We propose a hypernetwork architecture that leverages INRs to learn a compressed latent representation of an entire time series dataset.
arXiv Detail & Related papers (2022-08-11T14:05:51Z)
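For readers unfamiliar with INRs, the sketch below shows the general idea in PyTorch: a small network with sinusoidal activations maps a time coordinate to a series value, and the fitted weights become a continuous, resolution-independent encoding of the series. The activation choice and layer sizes are generic assumptions, not the paper's exact configuration.

    import torch
    import torch.nn as nn

    class SineINR(nn.Module):
        """Maps a time coordinate t in [0, 1] to a series value."""
        def __init__(self, hidden=64):
            super().__init__()
            self.l1 = nn.Linear(1, hidden)
            self.l2 = nn.Linear(hidden, hidden)
            self.l3 = nn.Linear(hidden, 1)

        def forward(self, t):
            x = torch.sin(30 * self.l1(t))   # sinusoidal layers capture periodic structure
            x = torch.sin(self.l2(x))
            return self.l3(x)

    # Fit the INR to one toy series; the weights then encode the series.
    t = torch.linspace(0, 1, 100).unsqueeze(-1)
    y = torch.sin(12 * t) + 0.3 * t                          # seasonality plus trend
    inr = SineINR()
    opt = torch.optim.Adam(inr.parameters(), lr=1e-3)
    for _ in range(500):
        opt.zero_grad()
        ((inr(t) - y) ** 2).mean().backward()
        opt.step()
    dense = inr(torch.linspace(0, 1, 1000).unsqueeze(-1))    # query at 10x resolution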
- Spatio-Temporal Joint Graph Convolutional Networks for Traffic Forecasting [75.10017445699532]
Recent studies have shifted their focus towards formulating traffic forecasting as a spatio-temporal graph modeling problem.
We propose a novel approach for accurate traffic forecasting on road networks over multiple future time steps.
arXiv Detail & Related papers (2021-11-25T08:45:14Z)
- Novel Features for Time Series Analysis: A Complex Networks Approach [62.997667081978825]
Time series data are ubiquitous in several domains such as climate, economics and health care.
A recent conceptual approach relies on mapping time series to complex networks.
Network analysis can then be used to characterize different types of time series.
arXiv Detail & Related papers (2021-10-11T13:46:28Z)
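A classic instance of such a mapping is the natural visibility graph, chosen here purely for illustration: each observation becomes a node, and two observations are linked when the straight line between them passes above every observation in between. Network statistics of the result, such as node degrees, then serve as time-series features.

    def visibility_graph(series):
        """Natural visibility graph: indices i < j are linked if the segment
        from (i, y_i) to (j, y_j) clears every intermediate point."""
        n = len(series)
        edges = set()
        for i in range(n):
            for j in range(i + 1, n):
                if all(series[k] < series[i] + (series[j] - series[i]) * (k - i) / (j - i)
                       for k in range(i + 1, j)):
                    edges.add((i, j))
        return edges

    # Node degrees of the visibility graph act as simple network features.
    series = [3.0, 1.0, 4.0, 1.0, 5.0, 9.0, 2.0]
    edges = visibility_graph(series)
    degrees = [sum(1 for e in edges if i in e) for i in range(len(series))]
    print(sorted(edges), degrees)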
- Event2Graph: Event-driven Bipartite Graph for Multivariate Time-series Anomaly Detection [25.832983667044708]
We propose a dynamic bipartite graph structure to encode the inter-dependencies between time series.
Based on this design, relations between time series can be explicitly modelled via dynamic connections to event nodes.
arXiv Detail & Related papers (2021-08-15T17:50:37Z)
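A hedged sketch of that data structure (the event names and assignment rule below are invented for illustration): series nodes sit on one side of the bipartite graph and event nodes on the other, so two series are implicitly related at a time step exactly when they attach to a common event node.

    from collections import defaultdict

    # Bipartite graph: series nodes on one side, event nodes on the other.
    # An edge (series_id, event_id) at step t means the series exhibits
    # that event pattern at time t.
    edges_at = defaultdict(set)   # t -> {(series_id, event_id)}

    def connect(t, series_id, event_id):
        edges_at[t].add((series_id, event_id))

    def related(t, s1, s2):
        """Two series are implicitly related at step t iff they share an event node."""
        events = lambda s: {e for sid, e in edges_at[t] if sid == s}
        return bool(events(s1) & events(s2))

    connect(0, "cpu_load", "spike")
    connect(0, "memory", "spike")
    print(related(0, "cpu_load", "memory"))  # True: both attach to the "spike" event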
- SST-GNN: Simplified Spatio-temporal Traffic forecasting model using Graph Neural Network [2.524966118517392]
We have designed a simplified spatio-temporal GNN (SST-GNN) that effectively encodes the dependency by separately aggregating different neighborhood representations.
We have shown that our model has significantly outperformed the state-of-the-art models on three real-world traffic datasets.
arXiv Detail & Related papers (2021-03-31T18:28:44Z)
- Connecting the Dots: Multivariate Time Series Forecasting with Graph Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
arXiv Detail & Related papers (2020-05-24T04:02:18Z)
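The graph learning module described above can be sketched as follows (a minimal rendering of the general idea; the paper's exact parameterization may differ): a directed adjacency matrix is derived from two learnable node-embedding tables, with an antisymmetric score so that an edge from variable i to j excludes the reverse edge.

    import torch
    import torch.nn as nn

    class GraphLearner(nn.Module):
        """Learns a sparse, uni-directed adjacency matrix among variables."""
        def __init__(self, n_nodes, dim=16, k=3):
            super().__init__()
            self.src = nn.Embedding(n_nodes, dim)   # "source" role embeddings
            self.dst = nn.Embedding(n_nodes, dim)   # "target" role embeddings
            self.k = k                              # outgoing edges kept per node

        def forward(self):
            e1, e2 = self.src.weight, self.dst.weight
            # Antisymmetric scores: after the ReLU, A[i, j] > 0 forces A[j, i] = 0,
            # so every learned relation is uni-directed.
            adj = torch.relu(torch.tanh(e1 @ e2.t() - e2 @ e1.t()))
            # Sparsify: keep only the top-k strongest outgoing edges per node.
            topk = adj.topk(self.k, dim=-1).indices
            mask = torch.zeros_like(adj).scatter_(-1, topk, 1.0)
            return adj * mask

    adj = GraphLearner(n_nodes=10)()   # (10, 10) directed adjacency matrix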
- ForecastNet: A Time-Variant Deep Feed-Forward Neural Network Architecture for Multi-Step-Ahead Time-Series Forecasting [6.043572971237165]
We propose ForecastNet, which uses a deep feed-forward architecture to provide a time-variant model.
ForecastNet is demonstrated to outperform statistical and deep learning benchmark models on several datasets.
arXiv Detail & Related papers (2020-02-11T01:03:33Z)