Towards Temporal Edge Regression: A Case Study on Agriculture Trade
Between Nations
- URL: http://arxiv.org/abs/2308.07883v1
- Date: Tue, 15 Aug 2023 17:13:16 GMT
- Title: Towards Temporal Edge Regression: A Case Study on Agriculture Trade
Between Nations
- Authors: Lekang Jiang, Caiqi Zhang, Farimah Poursafaei, Shenyang Huang
- Abstract summary: Graph Neural Networks (GNNs) have shown promising performance in tasks on dynamic graphs.
In this paper, we explore the application of GNNs to edge regression tasks in both static and dynamic settings.
- Score: 4.612412025217201
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently, Graph Neural Networks (GNNs) have shown promising performance in
tasks on dynamic graphs such as node classification, link prediction and graph
regression. However, little work has studied the temporal edge regression task,
which has important real-world applications. In this paper, we explore the
application of GNNs to edge regression tasks in both static and dynamic
settings, focusing on predicting food and agriculture trade values between
nations. We introduce three simple yet strong baselines and comprehensively
evaluate one static and three dynamic GNN models using the UN Trade dataset.
Our experimental results reveal that the baselines exhibit remarkably strong
performance across various settings, highlighting the inadequacy of existing
GNNs. We also find that TGN outperforms other GNN models, suggesting TGN is a
more appropriate choice for edge regression tasks. Moreover, we note that the
proportion of negative edges in the training samples significantly affects the
test performance. The companion source code can be found at:
https://github.com/scylj1/GNN_Edge_Regression.
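The abstract mentions three simple yet strong baselines without naming them. As a hedged illustration, here is a minimal sketch of one plausible baseline of that kind: a persistence forecast that predicts each edge's next trade value from its last observed value. The synthetic records and the MSE metric are illustrative assumptions, not necessarily the paper's actual baselines or evaluation protocol.

```python
# A minimal sketch of a "persistence" baseline for temporal edge regression:
# predict that an edge's next weight equals its last observed weight. This is
# an illustrative assumption, not confirmed to be one of the paper's baselines.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic UN Trade-style records: one value per (nation pair, year).
n_pairs, n_years = 50, 10
values = rng.gamma(shape=2.0, scale=100.0, size=(n_pairs, n_years))

# Persistence: the value at year t is forecast by the value at year t - 1.
preds = values[:, :-1]
targets = values[:, 1:]

mse = np.mean((preds - targets) ** 2)
print(f"persistence-baseline MSE: {mse:.2f}")
```

Even a forecaster this simple can be hard to beat when edge weights evolve slowly, which is consistent with the abstract's finding that simple baselines remain strong across settings.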
Related papers
- Classic GNNs are Strong Baselines: Reassessing GNNs for Node Classification [7.14327815822376]
Graph Transformers (GTs) have emerged as popular alternatives to traditional Graph Neural Networks (GNNs).
In this paper, we reevaluate the performance of three classic GNN models (GCN, GAT, and GraphSAGE) against GTs.
arXiv Detail & Related papers (2024-06-13T10:53:33Z)
- Online GNN Evaluation Under Test-time Graph Distribution Shifts [92.4376834462224]
A new research problem, online GNN evaluation, aims to provide valuable insights into the ability of well-trained GNNs to generalize to real-world unlabeled graphs.
We develop an effective learning behavior discrepancy score, dubbed LeBeD, to estimate the test-time generalization errors of well-trained GNN models.
arXiv Detail & Related papers (2024-03-15T01:28:08Z)
- Learning to Reweight for Graph Neural Network [63.978102332612906]
Graph Neural Networks (GNNs) show promising results for graph tasks.
The generalization ability of existing GNNs degrades when there are distribution shifts between training and testing graph data.
We propose a novel nonlinear graph decorrelation method, which can substantially improve the out-of-distribution generalization ability.
arXiv Detail & Related papers (2023-12-19T12:25:10Z)
- GNNEvaluator: Evaluating GNN Performance On Unseen Graphs Without Labels [81.93520935479984]
We study a new problem, GNN model evaluation, that aims to assess the performance of a specific GNN model trained on labeled and observed graphs.
We propose a two-stage GNN model evaluation framework, including (1) DiscGraph set construction and (2) GNNEvaluator training and inference.
Under the effective training supervision from the DiscGraph set, GNNEvaluator learns to precisely estimate node classification accuracy of the to-be-evaluated GNN model.
arXiv Detail & Related papers (2023-10-23T05:51:59Z)
- Pitfalls in Link Prediction with Graph Neural Networks: Understanding the Impact of Target-link Inclusion & Better Practices [28.88423949622]
Graph Neural Networks (GNNs) are remarkably successful in a variety of high-impact applications.
In link prediction, the common practice of including the edges being predicted in the graph at training and/or test time has an outsized impact on the performance of low-degree nodes.
We introduce SpotTarget, an effective and efficient GNN training framework that leverages our insight on low-degree nodes (see the target-edge exclusion sketch after this list).
arXiv Detail & Related papers (2023-06-01T16:56:04Z)
- Shift-Robust GNNs: Overcoming the Limitations of Localized Graph Training Data [52.771780951404565]
Shift-Robust GNN (SR-GNN) is designed to account for distributional differences between biased training data and the graph's true inference distribution.
We show that SR-GNN outperforms other GNN baselines in accuracy, eliminating at least 40% of the negative effects introduced by biased training data.
arXiv Detail & Related papers (2021-08-02T18:00:38Z)
- A Biased Graph Neural Network Sampler with Near-Optimal Regret [57.70126763759996]
Graph neural networks (GNNs) have emerged as a vehicle for applying deep network architectures to graph and relational data.
In this paper, we build upon existing work and treat GNN neighbor sampling as a multi-armed bandit problem.
We design a new reward function that introduces a controlled bias to reduce variance and avoid unstable, possibly unbounded payouts (see the bandit-sampler sketch after this list).
arXiv Detail & Related papers (2021-03-01T15:55:58Z)
- Learning to Drop: Robust Graph Neural Network via Topological Denoising [50.81722989898142]
We propose PTDNet, a parameterized topological denoising network, to improve the robustness and generalization performance of Graph Neural Networks (GNNs).
PTDNet prunes task-irrelevant edges by penalizing the number of edges in the sparsified graph with parameterized networks (see the edge-mask sketch after this list).
We show that PTDNet can improve the performance of GNNs significantly and the performance gain becomes larger for more noisy datasets.
arXiv Detail & Related papers (2020-11-13T18:53:21Z)
- Self-Enhanced GNN: Improving Graph Neural Networks Using Model Outputs [20.197085398581397]
Graph neural networks (GNNs) have received much attention recently because of their excellent performance on graph-based tasks.
We propose self-enhanced GNN (SEG), which improves the quality of the input data using the outputs of existing GNN models.
SEG consistently improves the performance of well-known GNN models such as GCN, GAT and SGC across different datasets.
arXiv Detail & Related papers (2020-02-18T12:27:16Z)
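As referenced in the SpotTarget entry above, the pitfall is that message passing can see the very edges being predicted. Below is a toy sketch of target-edge exclusion under assumed semantics: drop target edges with a low-degree endpoint from the message-passing graph before training. The degree threshold and the exact rule are illustrative assumptions, not the paper's precise algorithm.

```python
import torch

def exclude_low_degree_targets(edge_index, target_edges, degree,
                               deg_threshold=2):
    """Drop target edges with a low-degree endpoint from message passing.

    edge_index: (2, E) graph edges; target_edges: (2, T) edges to predict.
    """
    # Mark target edges where either endpoint has degree below the threshold.
    low_deg = (degree[target_edges[0]] < deg_threshold) | \
              (degree[target_edges[1]] < deg_threshold)
    to_drop = target_edges[:, low_deg]

    # Remove those edges (and their reverses) from the message-passing graph.
    drop_set = {(int(u), int(v)) for u, v in to_drop.t()}
    drop_set |= {(v, u) for u, v in drop_set}
    keep = [i for i in range(edge_index.size(1))
            if (int(edge_index[0, i]), int(edge_index[1, i])) not in drop_set]
    return edge_index[:, keep]

# Hypothetical usage on a tiny graph: node 0 has degree 1, so the target
# edge (0, 1) is excluded from message passing, while (1, 2) is kept.
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3], [1, 0, 2, 1, 3, 2]])
target_edges = torch.tensor([[0, 1], [1, 2]])
degree = torch.bincount(edge_index[0], minlength=4)
mp_edges = exclude_low_degree_targets(edge_index, target_edges, degree)
```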
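As referenced in the biased-sampler entry above, neighbor sampling can be framed as a multi-armed bandit: each neighbor of a node is an arm, and sampling it yields a reward. The sketch below uses a generic UCB rule with a hypothetical bounded reward signal; the paper's variance-reducing reward function is more involved.

```python
# Toy sketch of neighbor sampling as a multi-armed bandit. The reward here is
# a random stand-in; a real reward would measure how useful a sampled neighbor
# was for the task (e.g., a clipped reduction in loss).
import math
import random

def ucb_pick_neighbors(neighbors, counts, rewards, k=2, c=1.0):
    """Pick k neighbors by UCB score; counts/rewards map neighbor -> stats."""
    t = sum(counts[n] for n in neighbors) + 1
    def score(n):
        if counts[n] == 0:
            return float("inf")               # explore unseen arms first
        return rewards[n] / counts[n] + c * math.sqrt(math.log(t) / counts[n])
    return sorted(neighbors, key=score, reverse=True)[:k]

# Hypothetical training loop: keep payouts bounded to avoid instability.
neighbors = [1, 2, 3, 4]
counts = {n: 0 for n in neighbors}
rewards = {n: 0.0 for n in neighbors}
for _ in range(10):
    for n in ucb_pick_neighbors(neighbors, counts, rewards):
        r = min(random.random(), 1.0)         # stand-in bounded reward
        counts[n] += 1
        rewards[n] += r
```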
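As referenced in the Learning to Drop entry above, edge pruning can be parameterized by a small network that gates each edge, with a sparsity penalty approximating a count of kept edges. The layer sizes, sigmoid gating, and L1-style penalty below are illustrative assumptions, not PTDNet's exact formulation.

```python
import torch
import torch.nn as nn

class EdgeDenoiser(nn.Module):
    """Scores each edge and soft-masks it; illustrative, not PTDNet itself."""
    def __init__(self, node_dim: int, hidden: int = 16):
        super().__init__()
        self.scorer = nn.Sequential(
            nn.Linear(2 * node_dim, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, x, edge_index):
        # Score each edge from the concatenated features of its endpoints.
        pair = torch.cat([x[edge_index[0]], x[edge_index[1]]], dim=-1)
        gate = torch.sigmoid(self.scorer(pair)).squeeze(-1)  # (E,) in [0, 1]
        sparsity_penalty = gate.sum()     # ~ expected number of kept edges
        return gate, sparsity_penalty

# Hypothetical usage: weight each message by its gate and add
# lambda * sparsity_penalty to the task loss to prune edges end to end.
x = torch.randn(5, 8)                     # 5 nodes, 8-dim features
edge_index = torch.tensor([[0, 1, 2, 3], [1, 2, 3, 4]])
gate, penalty = EdgeDenoiser(node_dim=8)(x, edge_index)
```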
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented here and is not responsible for any consequences arising from its use.