Adaptive Dependency Learning Graph Neural Networks
- URL: http://arxiv.org/abs/2312.03903v1
- Date: Wed, 6 Dec 2023 20:56:23 GMT
- Title: Adaptive Dependency Learning Graph Neural Networks
- Authors: Abishek Sriramulu, Nicolas Fourrier and Christoph Bergmeir
- Abstract summary: We propose a hybrid approach combining neural networks and statistical structure learning models to self-learn dependencies.
We demonstrate significantly improved performance using our proposed approach on real-world benchmark datasets without a pre-defined dependency graph.
- Score: 5.653058780958551
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNN) have recently gained popularity in the
forecasting domain due to their ability to model complex spatial and temporal
patterns in tasks such as traffic forecasting and region-based demand
forecasting. Most of these methods require a predefined graph as input, whereas
in real-life multivariate time series problems, a well-defined dependency
graph rarely exists. This requirement makes it harder for GNNs to be utilised
widely for multivariate forecasting problems in other domains such as retail or
energy. In this paper, we propose a hybrid approach combining neural networks
and statistical structure learning models to self-learn the dependencies and
construct a dynamically changing dependency graph from multivariate data, aiming
to enable the use of GNNs for multivariate forecasting even when a well-defined
graph does not exist. The statistical structure modeling in conjunction with
neural networks provides a well-principled and efficient approach by bringing
in causal semantics to determine dependencies among the series. Finally, we
demonstrate significantly improved performance using our proposed approach on
real-world benchmark datasets without a pre-defined dependency graph.
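As a rough, hedged illustration of the idea (not the authors' exact pipeline), the sketch below estimates a sparse dependency graph from the series via thresholded partial correlations, a standard statistical structure-learning proxy, and feeds it to a generic graph-convolution step; the function names, threshold, and sizes are illustrative.

```python
import numpy as np

def learn_dependency_graph(X, threshold=0.2):
    """Estimate a sparse dependency graph from a multivariate series.
    X has shape (T, N): T time steps, N series. Thresholded partial
    correlations (from a regularised precision matrix) stand in here
    for the paper's statistical structure-learning component."""
    cov = np.cov(X, rowvar=False) + 1e-3 * np.eye(X.shape[1])
    prec = np.linalg.inv(cov)
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)          # partial correlation matrix
    np.fill_diagonal(pcorr, 0.0)
    return np.where(np.abs(pcorr) > threshold, np.abs(pcorr), 0.0)

def graph_conv_step(x, A, W):
    """One generic graph-convolution step over the learned graph:
    degree-normalised neighbour aggregation, then a linear projection."""
    deg = A.sum(axis=1, keepdims=True) + 1e-8
    return np.tanh((A / deg) @ x @ W)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 8))           # toy data: 200 steps, 8 series
A = learn_dependency_graph(X)               # re-estimated as new data arrives
h = graph_conv_step(X[-1][:, None], A, rng.standard_normal((1, 4)) * 0.1)
```

Re-estimating A on a sliding window is one simple way to realise the "dynamically changing" graph the abstract describes.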
Related papers
- Hierarchical Joint Graph Learning and Multivariate Time Series
Forecasting [0.16492989697868887]
We introduce a method of representing multivariate signals as nodes in a graph with edges indicating interdependency between them.
We leverage graph neural networks (GNN) and attention mechanisms to efficiently learn the underlying relationships within the time series data.
The effectiveness of our proposed model is evaluated across various real-world benchmark datasets designed for long-term forecasting tasks.
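A minimal sketch of the edge-learning idea, assuming scaled dot-product attention over learnable per-series embeddings (the class and parameter names are hypothetical, not the paper's):

```python
import torch

class AttentionGraphLearner(torch.nn.Module):
    """Learn a soft inter-series adjacency as scaled dot-product
    attention between per-series (node) embeddings."""
    def __init__(self, num_series, dim):
        super().__init__()
        self.emb = torch.nn.Parameter(torch.randn(num_series, dim) * 0.1)
        self.Wq = torch.nn.Linear(dim, dim, bias=False)
        self.Wk = torch.nn.Linear(dim, dim, bias=False)

    def forward(self):
        q, k = self.Wq(self.emb), self.Wk(self.emb)
        scores = q @ k.T / (k.shape[-1] ** 0.5)  # pairwise interdependency
        return torch.softmax(scores, dim=-1)     # row-normalised edge weights

A = AttentionGraphLearner(num_series=8, dim=16)()  # (8, 8) soft adjacency
```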
arXiv Detail & Related papers (2023-11-21T14:24:21Z) - FourierGNN: Rethinking Multivariate Time Series Forecasting from a Pure
Graph Perspective [48.00240550685946]
Current state-of-the-art graph neural network (GNN)-based forecasting methods usually require both graph networks (e.g., GCN) and temporal networks (e.g., LSTM) to capture inter-series (spatial) dynamics and intra-series (temporal) dependencies, respectively.
We propose a novel Fourier Graph Neural Network (FourierGNN) by stacking our proposed Fourier Graph Operator (FGO) to perform matrix multiplications in Fourier space.
Our experiments on seven datasets have demonstrated superior performance with higher efficiency and fewer parameters compared with state-of-the-art methods.
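A hedged sketch of the frequency-domain idea: FFT the node axis, apply a learnable complex matrix multiplication, and transform back. This is a schematic reading of the FGO, not the authors' implementation, and all sizes are illustrative.

```python
import torch

class FourierGraphOperator(torch.nn.Module):
    """Schematic FGO-style layer: node mixing as a learnable matrix
    multiplication in Fourier space instead of an explicit adjacency."""
    def __init__(self, dim):
        super().__init__()
        self.weight = torch.nn.Parameter(
            torch.randn(dim, dim, dtype=torch.cfloat) * 0.02)

    def forward(self, x):
        # x: (batch, nodes, dim); FFT along the node axis
        xf = torch.fft.fft(x.to(torch.cfloat), dim=1)
        yf = torch.einsum('bnd,de->bne', xf, self.weight)  # mix in Fourier space
        return torch.fft.ifft(yf, dim=1).real

layer = FourierGraphOperator(dim=16)
out = layer(torch.randn(4, 32, 16))  # 4 samples, 32 nodes, 16 features
```

Stacking several such layers plays the role that separate graph and temporal networks play in conventional designs.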
arXiv Detail & Related papers (2023-11-10T17:13:26Z) - Graph-enabled Reinforcement Learning for Time Series Forecasting with
Adaptive Intelligence [11.249626785206003]
We propose a novel approach for predicting time-series data using graph neural networks (GNNs) and monitoring with Reinforcement Learning (RL).
GNNs are able to explicitly incorporate the graph structure of the data into the model, allowing them to capture temporal dependencies in a more natural way.
This approach allows for more accurate predictions in complex temporal structures, such as those found in healthcare, traffic and weather forecasting.
arXiv Detail & Related papers (2023-09-18T22:25:12Z) - MGNNI: Multiscale Graph Neural Networks with Implicit Layers [53.75421430520501]
Implicit graph neural networks (GNNs) have been proposed to capture long-range dependencies in underlying graphs.
We identify and justify two weaknesses of implicit GNNs: constrained expressiveness due to their limited effective range for capturing long-range dependencies, and an inability to capture multiscale information on graphs at multiple resolutions.
We propose a multiscale graph neural network with implicit layers (MGNNI) which is able to model multiscale structures on graphs and has an expanded effective range for capturing long-range dependencies.
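The single-scale fixed-point idea behind implicit layers can be sketched as follows; MGNNI additionally mixes several propagation scales, and the contraction setup here (gamma, tanh) is only illustrative.

```python
import numpy as np

def implicit_gnn_layer(A, X, W, gamma=0.5, tol=1e-6, max_iter=200):
    """Node states as the fixed point Z = tanh(gamma * A @ Z @ W + X),
    found by simple iteration. With gamma * ||A|| * ||W|| < 1 the map is
    a contraction, so the effective receptive field is not capped by a
    fixed number of stacked layers."""
    Z = np.zeros_like(X)
    for _ in range(max_iter):
        Z_next = np.tanh(gamma * A @ Z @ W + X)
        if np.max(np.abs(Z_next - Z)) < tol:
            break
        Z = Z_next
    return Z
```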
arXiv Detail & Related papers (2022-10-15T18:18:55Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
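A minimal sketch of that parameterisation, assuming one learnable scalar per relation plus one for self-loops, scaling each relation's adjacency before a shared projection (names are illustrative):

```python
import torch

class RelationEmbeddingGNN(torch.nn.Module):
    """One scalar weight per edge type (and one for self-loops) rescales
    each relation's adjacency before a shared, homogeneous aggregation."""
    def __init__(self, num_relations, in_dim, out_dim):
        super().__init__()
        self.rel_weight = torch.nn.Parameter(torch.ones(num_relations))
        self.self_weight = torch.nn.Parameter(torch.ones(1))
        self.lin = torch.nn.Linear(in_dim, out_dim)

    def forward(self, X, adjs):
        # X: (N, in_dim); adjs: list of (N, N) adjacencies, one per relation
        agg = self.self_weight * X
        for r, A in enumerate(adjs):
            agg = agg + self.rel_weight[r] * (A @ X)
        return torch.relu(self.lin(agg))

layer = RelationEmbeddingGNN(num_relations=3, in_dim=16, out_dim=8)
out = layer(torch.randn(10, 16), [torch.rand(10, 10) for _ in range(3)])
```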
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Graph-Time Convolutional Neural Networks: Architecture and Theoretical
Analysis [12.995632804090198]
We introduce Graph-Time Convolutional Neural Networks (GTCNNs) as a principled architecture to aid learning.
The approach can work with any type of product graph, and we also introduce a parametric product graph to learn the spatiotemporal coupling.
Extensive numerical results on benchmark datasets corroborate our findings and show that the GTCNN compares favorably with state-of-the-art solutions.
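One common product construction, sketched here assuming a Cartesian product between the spatial graph and a path graph over time steps (the parametric product in the paper generalises this choice):

```python
import numpy as np

def cartesian_space_time_graph(A_space, T):
    """Cartesian product of a spatial graph with a T-node path graph:
    nodes are (time step, sensor) pairs; edges link spatial neighbours
    within a step and the same sensor across adjacent steps."""
    N = A_space.shape[0]
    A_time = np.eye(T, k=1) + np.eye(T, k=-1)  # path graph over time
    return np.kron(A_time, np.eye(N)) + np.kron(np.eye(T), A_space)

A_st = cartesian_space_time_graph(np.ones((3, 3)) - np.eye(3), T=4)  # (12, 12)
```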
arXiv Detail & Related papers (2022-06-30T10:20:52Z) - Long-term Spatio-temporal Forecasting via Dynamic Multiple-Graph
Attention [20.52864145999387]
Long-term spatio-temporal forecasting (LSTF) makes use of long-term dependency between spatial and temporal domains, contextual information, and inherent patterns in the data.
We propose new graph models to represent the contextual information of each node and the long-term spatio-temporal data dependency structure.
Our proposed approaches significantly improve the performance of existing graph neural network models in LSTF prediction tasks.
arXiv Detail & Related papers (2022-04-23T06:51:37Z) - EIGNN: Efficient Infinite-Depth Graph Neural Networks [51.97361378423152]
Graph neural networks (GNNs) are widely used for modelling graph-structured data in numerous applications.
Motivated by this limitation, we propose a GNN model with infinite depth, which we call Efficient Infinite-Depth Graph Neural Networks (EIGNN).
We show that EIGNN has a better ability to capture long-range dependencies than recent baselines, and consistently achieves state-of-the-art performance.
arXiv Detail & Related papers (2022-02-22T08:16:58Z) - Connecting the Dots: Multivariate Time Series Forecasting with Graph
Neural Networks [91.65637773358347]
We propose a general graph neural network framework designed specifically for multivariate time series data.
Our approach automatically extracts the uni-directed relations among variables through a graph learning module.
Our proposed model outperforms the state-of-the-art baseline methods on 3 of 4 benchmark datasets.
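The uni-directed graph learning module can be sketched roughly as follows: two node-embedding tables yield an antisymmetric score, so at most one direction survives per node pair (alpha and the sizes are illustrative, following the published formulation in spirit):

```python
import torch

class GraphLearningModule(torch.nn.Module):
    """Learn a uni-directed adjacency from two node-embedding tables:
    the antisymmetric score M1 @ M2.T - M2 @ M1.T means an edge i->j
    suppresses the reverse edge j->i after the ReLU."""
    def __init__(self, num_nodes, emb_dim, alpha=3.0):
        super().__init__()
        self.E1 = torch.nn.Parameter(torch.randn(num_nodes, emb_dim) * 0.1)
        self.E2 = torch.nn.Parameter(torch.randn(num_nodes, emb_dim) * 0.1)
        self.alpha = alpha

    def forward(self):
        M1 = torch.tanh(self.alpha * self.E1)
        M2 = torch.tanh(self.alpha * self.E2)
        return torch.relu(torch.tanh(self.alpha * (M1 @ M2.T - M2 @ M1.T)))

adj = GraphLearningModule(num_nodes=10, emb_dim=8)()  # learned (10, 10) graph
```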
arXiv Detail & Related papers (2020-05-24T04:02:18Z) - Tensor Graph Convolutional Networks for Multi-relational and Robust
Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)