Training Sensitivity in Graph Isomorphism Network
- URL: http://arxiv.org/abs/2008.09020v1
- Date: Wed, 19 Aug 2020 03:50:28 GMT
- Title: Training Sensitivity in Graph Isomorphism Network
- Authors: Md. Khaledur Rahman
- Abstract summary: A graph neural network (GNN) is a popular tool for learning lower-dimensional representations of graphs.
This paper studies various alternative functions for each respective module using a diverse set of benchmark datasets.
- Score: 2.487445341407889
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: A graph neural network (GNN) is a popular tool for learning a
lower-dimensional representation of a graph. It facilitates the applicability
of machine learning tasks on graphs by incorporating domain-specific features.
There are various options for the underlying procedures (such as optimization
functions, activation functions, etc.) that can be considered in the
implementation of a GNN. However, most existing tools are confined to a single
choice without any analysis, so this emerging field lacks robust
implementations that account for the highly irregular structure of real-world
graphs. In this paper, we attempt to fill this gap by studying various
alternative functions for each respective module using a diverse set of
benchmark datasets. Our empirical results suggest that the commonly used
underlying techniques do not always perform well in capturing the overall
structure of a set of graphs.
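To make the studied design space concrete, here is a minimal PyTorch sketch (not the paper's code) of a GIN-style layer in which the activation function and the optimizer are swappable; these are the kinds of underlying functions the paper varies. The dense adjacency and the particular choices shown are illustrative assumptions.

```python
# Minimal sketch: GIN-style update h' = MLP((1 + eps) * h + sum of neighbor h),
# with a pluggable activation inside the MLP and a pluggable optimizer.
import torch
import torch.nn as nn

class GINLayer(nn.Module):
    def __init__(self, dim, activation=nn.ReLU):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))  # learnable epsilon
        self.mlp = nn.Sequential(nn.Linear(dim, dim), activation(),
                                 nn.Linear(dim, dim))

    def forward(self, x, adj):
        # adj: dense (N, N) adjacency; sum-aggregate neighbors, then transform
        return self.mlp((1 + self.eps) * x + adj @ x)

x = torch.randn(5, 16)                               # 5 nodes, 16 features
adj = (torch.rand(5, 5) > 0.5).float()               # toy adjacency
layer = GINLayer(16, activation=nn.Tanh)             # swap ReLU for Tanh
opt = torch.optim.SGD(layer.parameters(), lr=0.01)   # or Adam, RMSprop, ...
loss = layer(x, adj).pow(2).mean()                   # dummy training signal
loss.backward(); opt.step()
```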
Related papers
- Graph Structure Prompt Learning: A Novel Methodology to Improve Performance of Graph Neural Networks [13.655670509818144]
We propose a novel Graph structure Prompt Learning method (GPL) to enhance the training of Graph Neural Networks (GNNs).
GPL employs task-independent graph structure losses to encourage GNNs to learn intrinsic graph characteristics while simultaneously solving downstream tasks.
In experiments on eleven real-world datasets, GNNs trained with GPL significantly outperform their original performance on node classification, graph classification, and edge prediction tasks.
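The summary does not specify which structure losses GPL uses; a common task-independent choice is edge reconstruction, sketched below as a hypothetical illustration of the joint objective (the loss weight alpha is assumed).

```python
# Hypothetical joint objective in the spirit of GPL: downstream task loss
# plus a task-independent graph-structure loss (dot-product edge recovery).
import torch
import torch.nn.functional as F

def joint_loss(node_emb, logits, labels, edge_index, alpha=0.5):
    task = F.cross_entropy(logits, labels)            # downstream task loss
    src, dst = edge_index                             # observed edges
    score = (node_emb[src] * node_emb[dst]).sum(-1)   # edge scores
    structure = F.binary_cross_entropy_with_logits(   # structure loss:
        score, torch.ones_like(score))                # recover observed edges
    return task + alpha * structure                   # alpha: assumed weight

emb = torch.randn(10, 16)
logits, labels = torch.randn(10, 3), torch.randint(0, 3, (10,))
edges = (torch.tensor([0, 1, 2]), torch.tensor([1, 2, 3]))
loss = joint_loss(emb, logits, labels, edges)
```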
arXiv Detail & Related papers (2024-07-16T03:59:18Z)
- How Graph Neural Networks Learn: Lessons from Training Dynamics [80.41778059014393]
We study the training dynamics in function space of graph neural networks (GNNs)
We find that the gradient descent optimization of GNNs implicitly leverages the graph structure to update the learned function.
This finding offers new interpretable insights into when and why the learned GNN functions generalize.
arXiv Detail & Related papers (2023-10-08T10:19:56Z)
- TouchUp-G: Improving Feature Representation through Graph-Centric Finetuning [37.318961625795204]
Graph Neural Networks (GNNs) have become the state-of-the-art approach for many high-impact, real-world graph applications.
For feature-rich graphs, a prevalent practice involves utilizing a pretrained model (PM) directly to generate features.
This practice is suboptimal because the node features extracted from PM are graph-agnostic and prevent GNNs from fully utilizing the potential correlations between the graph structure and node features.
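As a hedged illustration of graph-centric finetuning, one plausible signal is a link-prediction loss that pulls PM features of linked nodes together; the adapter head, optimizer, and negative sampling below are assumptions, not necessarily the paper's exact method.

```python
# Make frozen PM features graph-aware: finetune a small adapter so that
# linked nodes score higher than random negatives (link-prediction loss).
import torch
import torch.nn as nn
import torch.nn.functional as F

pm_feats = torch.randn(100, 64)          # frozen PM features for 100 nodes
adapter = nn.Linear(64, 64)              # small trainable "touch-up" head
opt = torch.optim.Adam(adapter.parameters(), lr=1e-3)

src = torch.randint(0, 100, (256,))      # toy positive edge endpoints
dst = torch.randint(0, 100, (256,))
neg = torch.randint(0, 100, (256,))      # random negative endpoints

h = adapter(pm_feats)
pos_score = (h[src] * h[dst]).sum(-1)
neg_score = (h[src] * h[neg]).sum(-1)
loss = F.binary_cross_entropy_with_logits(
    torch.cat([pos_score, neg_score]),
    torch.cat([torch.ones(256), torch.zeros(256)]))
loss.backward(); opt.step()
```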
arXiv Detail & Related papers (2023-09-25T05:44:40Z)
- SimTeG: A Frustratingly Simple Approach Improves Textual Graph Learning [131.04781590452308]
We present SimTeG, a frustratingly Simple approach for Textual Graph learning.
We first perform supervised parameter-efficient fine-tuning (PEFT) of a pre-trained LM on the downstream task.
We then generate node embeddings using the last hidden states of the fine-tuned LM.
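A sketch of this two-step recipe, assuming LoRA as the PEFT method, bert-base-uncased as the LM, and mean pooling over last hidden states (all illustrative choices):

```python
# Step 1: wrap a pre-trained LM with LoRA adapters for PEFT.
# Step 2: extract node embeddings from the (fine-tuned) LM's hidden states.
import torch
from transformers import AutoModel, AutoTokenizer
from peft import LoraConfig, get_peft_model

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
lm = AutoModel.from_pretrained("bert-base-uncased")
lm = get_peft_model(lm, LoraConfig(r=8, lora_alpha=16,
                                   target_modules=["query", "value"]))
# ... supervised PEFT on the downstream task would go here ...

texts = ["node 0 text", "node 1 text"]           # one text per node
batch = tok(texts, padding=True, return_tensors="pt")
with torch.no_grad():
    hidden = lm(**batch).last_hidden_state       # (N, T, d)
mask = batch["attention_mask"].unsqueeze(-1)
node_emb = (hidden * mask).sum(1) / mask.sum(1)  # masked mean pooling
```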
arXiv Detail & Related papers (2023-08-03T07:00:04Z)
- Learning Adaptive Neighborhoods for Graph Neural Networks [45.94778766867247]
Graph convolutional networks (GCNs) enable end-to-end learning on graph-structured data.
We propose a novel end-to-end differentiable graph generator which builds graph topologies.
Our module can be readily integrated into existing pipelines involving graph convolution operations.
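A minimal sketch of such a differentiable graph generator, using a sigmoid relaxation over pairwise similarity logits (an assumed design, not the paper's exact module), plugged into a simple graph convolution:

```python
# Learn a soft adjacency end-to-end, then run graph convolution over it.
import torch
import torch.nn as nn

class SoftGraphGenerator(nn.Module):
    def __init__(self, dim, temp=1.0):
        super().__init__()
        self.proj = nn.Linear(dim, dim)
        self.temp = temp

    def forward(self, x):
        z = self.proj(x)
        logits = z @ z.t() / self.temp       # pairwise similarity logits
        return torch.sigmoid(logits)         # soft, differentiable adjacency

x = torch.randn(10, 32)
adj = SoftGraphGenerator(32)(x)              # learned topology
conv = nn.Linear(32, 32)
h = torch.relu(adj @ conv(x))                # graph convolution over it
```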
arXiv Detail & Related papers (2023-07-18T08:37:25Z)
- Joint Feature and Differentiable $k$-NN Graph Learning using Dirichlet Energy [103.74640329539389]
We propose a deep feature selection (FS) method that simultaneously conducts feature selection and differentiable $k$-NN graph learning.
We employ Optimal Transport theory to address the non-differentiability of learning $k$-NN graphs in neural networks.
We validate the effectiveness of our model with extensive experiments on both synthetic and real-world datasets.
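As a simplified stand-in for the OT-based relaxation, the sketch below makes k-NN-style neighbor selection differentiable with a temperature softmax over pairwise squared distances; the actual paper uses Optimal Transport rather than this softmax.

```python
# Differentiable soft neighborhood weights (softmax stand-in for soft top-k).
import torch

def soft_knn_weights(x, temp=0.1):
    d2 = (x.unsqueeze(0) - x.unsqueeze(1)).pow(2).sum(-1)  # squared distances
    mask = torch.eye(len(x), dtype=torch.bool)
    d2 = d2.masked_fill(mask, float("inf"))                # exclude self
    return torch.softmax(-d2 / temp, dim=-1)               # rows sum to 1

x = torch.randn(8, 4, requires_grad=True)
w = soft_knn_weights(x)         # soft neighborhood graph
w.sum().backward()              # gradients flow back to the features
```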
arXiv Detail & Related papers (2023-05-21T08:15:55Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric model for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing, and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
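Under the feature-smoothing view, a hedged sketch of one scalable extension to heterogeneous graphs: propagate features separately over each relation's normalized adjacency and combine (the relation count and mean combination are assumptions).

```python
# Per-relation feature smoothing, then a simple combination across relations.
import torch

def smooth(adj, x, hops=2):
    deg = adj.sum(-1, keepdim=True).clamp(min=1)
    norm = adj / deg                    # row-normalized propagation operator
    for _ in range(hops):
        x = norm @ x                    # k-hop feature smoothing
    return x

n, d = 20, 8
x = torch.randn(n, d)
rel_adjs = [(torch.rand(n, n) > 0.8).float() for _ in range(3)]  # 3 relations
h = torch.stack([smooth(a, x) for a in rel_adjs]).mean(0)        # combine
```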
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order isomorphism tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of techniques for graph representation learning.
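A minimal sketch of the distance-encoding idea: compute each node's shortest-path distance to a target node set by BFS and append it as an extra structural feature (the distance cap and toy graph are illustrative).

```python
# Append shortest-path distances to a target node set as structural features,
# so nodes with identical neighborhoods but different positions differ.
import torch
from collections import deque

def spd_from(targets, adj_list, n):
    dist = [float("inf")] * n
    q = deque(targets)
    for t in targets:
        dist[t] = 0
    while q:                                    # plain BFS
        u = q.popleft()
        for v in adj_list[u]:
            if dist[v] == float("inf"):
                dist[v] = dist[u] + 1
                q.append(v)
    return torch.tensor(dist).clamp(max=10.0)   # cap unreachable/far nodes

adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
x = torch.randn(4, 5)
de = spd_from([0], adj_list, 4).unsqueeze(-1)   # distances to node 0
x_aug = torch.cat([x, de], dim=-1)              # DE-augmented features
```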
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding (GCC) -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
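A sketch of the contrastive objective commonly underlying such pre-training (InfoNCE over two views of the same subgraph instance); the subgraph sampling and encoder are omitted as assumptions:

```python
# InfoNCE: pull together two views of the same instance, push apart the rest.
import torch
import torch.nn.functional as F

def info_nce(z1, z2, temp=0.07):
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temp          # (B, B) similarity matrix
    labels = torch.arange(len(z1))       # positives sit on the diagonal
    return F.cross_entropy(logits, labels)

z1 = torch.randn(32, 64)    # encoder output, view 1 of 32 subgraph instances
z2 = torch.randn(32, 64)    # encoder output, view 2 of the same instances
loss = info_nce(z1, z2)
```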
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
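A minimal sketch of substructure encoding: count one substructure (triangles, via diag(A^3)/2 per node) and append the counts to the node features before message passing; GSN supports more general substructures than this.

```python
# Triangle counts as structural node features (one instance of substructure
# encoding; each closed 3-walk pair corresponds to one triangle, hence / 2).
import torch

adj = (torch.rand(10, 10) > 0.7).float()
adj = ((adj + adj.t()) > 0).float()
adj.fill_diagonal_(0)                              # simple undirected graph

tri = torch.diagonal(adj @ adj @ adj) / 2          # triangles through each node
x = torch.randn(10, 16)
x_aug = torch.cat([x, tri.unsqueeze(-1)], dim=-1)  # structure-aware features
```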
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
- Pointer Graph Networks [48.44209547013781]
Graph neural networks (GNNs) are typically applied to static graphs that are assumed to be known upfront.
Pointer Graph Networks (PGNs) augment sets or graphs with additional inferred edges for improved model generalisation ability.
PGNs allow each node to dynamically point to another node, followed by message passing over these pointers.
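A hedged sketch of the pointer mechanism: each node scores all others and points to its argmax, then messages flow along the inferred pointer edges (the hard argmax here sidesteps how PGNs actually supervise and relax pointer learning):

```python
# Each node picks one pointer target, then receives a message along it.
import torch
import torch.nn as nn

x = torch.randn(6, 8)
q, k = nn.Linear(8, 8), nn.Linear(8, 8)
scores = q(x) @ k(x).t()            # (N, N) pointer scores
ptr = scores.argmax(dim=-1)         # node i points to node ptr[i]

msg = nn.Linear(8, 8)
h = torch.relu(x + msg(x[ptr]))     # message passing over pointer edges
```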
arXiv Detail & Related papers (2020-06-11T12:52:31Z)