Strong Transitivity Relations and Graph Neural Networks
- URL: http://arxiv.org/abs/2401.01384v1
- Date: Mon, 1 Jan 2024 13:53:50 GMT
- Title: Strong Transitivity Relations and Graph Neural Networks
- Authors: Yassin Mohamadi and Mostafa Haghir Chehreghani
- Abstract summary: Local neighborhoods play a crucial role in embedding generation in graph-based learning.
We introduce the Transitivity Graph Neural Network (TransGNN), which takes global similarities into account in addition to local node similarities.
We evaluate our model on several real-world datasets and show that it considerably improves the performance of several well-known GNN models.
- Score: 1.19658449368018
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Local neighborhoods play a crucial role in embedding generation in
graph-based learning. It is commonly believed that nodes ought to have
embeddings that resemble those of their neighbors. In this research, we try to
carefully expand the concept of similarity from nearby neighborhoods to the
entire graph. We provide an extension of similarity that is based on
transitivity relations, which enables Graph Neural Networks (GNNs) to capture
both global similarities and local similarities over the whole graph. We
introduce the Transitivity Graph Neural Network (TransGNN), which, in addition
to local node similarities, takes global similarities into account by
distinguishing strong transitivity relations from weak ones and exploiting
them. We evaluate our model on several real-world datasets and show that it
considerably improves the performance of several well-known GNN models on
tasks such as node classification.
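To make the idea concrete, here is a minimal, illustrative sketch (not the paper's implementation) of extending a node's neighborhood with strongly transitive partners before a standard aggregation step. The strong-transitivity test used below (at least `tau` length-2 paths between two nodes) and the mean aggregation are assumptions made only for illustration.

```python
# Hedged sketch: augment local neighborhoods with "strongly transitive" node
# pairs before aggregation. Strong transitivity between u and v is
# approximated here by the number of length-2 paths exceeding a threshold
# `tau`; this criterion is an assumption, not the paper's definition.
import numpy as np

def strong_transitivity_neighbors(adj: np.ndarray, tau: int = 2) -> np.ndarray:
    """Return a 0/1 matrix marking node pairs linked by >= tau length-2 paths."""
    two_hop = adj @ adj                       # counts of length-2 paths
    np.fill_diagonal(two_hop, 0)              # ignore trivial self-paths
    return (two_hop >= tau).astype(adj.dtype)

def aggregate(adj: np.ndarray, features: np.ndarray, tau: int = 2) -> np.ndarray:
    """One mean-aggregation step over local neighbors plus strong-transitivity pairs."""
    extended = np.clip(adj + strong_transitivity_neighbors(adj, tau), 0, 1)
    extended += np.eye(adj.shape[0], dtype=adj.dtype)   # add self-loops
    degree = extended.sum(axis=1, keepdims=True)
    return (extended @ features) / np.maximum(degree, 1)

# Toy example: a 4-node path graph 0-1-2-3 with random features.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
features = np.random.rand(4, 8)
print(aggregate(adj, features, tau=1).shape)   # (4, 8)
```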
Related papers
- Exploring Consistency in Graph Representations:from Graph Kernels to Graph Neural Networks [4.235378870514331]
Graph Neural Networks (GNNs) have emerged as a dominant approach in graph representation learning.
We bridge the gap between neural network methods and kernel approaches by enabling GNNs to consistently capture structures in their learned representations.
Inspired by these findings, we conjecture that the consistency in the similarities of graph representations across GNN layers is crucial in capturing relational structures and enhancing graph classification performance.
arXiv Detail & Related papers (2024-10-31T09:07:08Z) - Federated Graph Semantic and Structural Learning [54.97668931176513]
This paper reveals that local client distortion is brought by both node-level semantics and graph-level structure.
We postulate that a well-structured graph neural network produces similar representations for neighboring nodes due to the inherent adjacency relationships.
We transform the adjacency relationships into the similarity distribution and leverage the global model to distill the relation knowledge into the local model.
arXiv Detail & Related papers (2024-06-27T07:08:28Z) - LSGNN: Towards General Graph Neural Network in Node Classification by
Local Similarity [59.41119013018377]
We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), can offer comparable or superior state-of-the-art performance on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2023-05-07T09:06:11Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Breaking the Limit of Graph Neural Networks by Improving the
Assortativity of Graphs with Local Mixing Patterns [19.346133577539394]
Graph neural networks (GNNs) have achieved tremendous success on multiple graph-based learning tasks.
We focus on transforming the input graph into a computation graph which contains both proximity and structural information.
We show that adaptively choosing between structure and proximity leads to improved performance under diverse mixing.
arXiv Detail & Related papers (2021-06-11T19:18:34Z) - Reinforced Neighborhood Selection Guided Multi-Relational Graph Neural
Networks [68.9026534589483]
RioGNN is a novel Reinforced, recursive and flexible neighborhood selection guided multi-relational Graph Neural Network architecture.
RioGNN can learn more discriminative node embeddings with enhanced explainability due to the recognition of the individual importance of each relation.
arXiv Detail & Related papers (2021-04-16T04:30:06Z) - Node Similarity Preserving Graph Convolutional Networks [51.520749924844054]
Graph Neural Networks (GNNs) explore the graph structure and node features by aggregating and transforming information within node neighborhoods.
We propose SimP-GCN that can effectively and efficiently preserve node similarity while exploiting graph structure.
We validate the effectiveness of SimP-GCN on seven benchmark datasets, including three assortative and four disassortative graphs.
arXiv Detail & Related papers (2020-11-19T04:18:01Z) - Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input, and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on the synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
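The bilinear neighbor-interaction operator summarized in the last entry (BGNN) can be sketched as follows. This is an illustrative approximation only: the mixing weight `alpha`, the mean normalization, and the pairwise-sum identity ((sum x)^2 - sum x^2) / 2 are implementation choices, not details taken from that paper.

```python
# Hedged sketch of a bilinear neighbor-interaction layer: the usual weighted
# sum over neighbors is augmented with element-wise products of neighbor-pair
# representations, summed over all neighbor pairs of each node.
import numpy as np

def bilinear_layer(adj, features, w_sum, w_bi, alpha=0.5):
    """Combine mean neighbor aggregation with pairwise neighbor interactions."""
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1)
    linear = (adj @ features @ w_sum) / deg            # standard weighted sum

    proj = features @ w_bi
    s = adj @ proj                                      # sum of neighbor projections
    sq = adj @ (proj * proj)                            # sum of squared projections
    pairwise = 0.5 * (s * s - sq)                       # sum over neighbor pairs i < j
    n_pairs = np.maximum(deg * (deg - 1) / 2, 1)
    bilinear = pairwise / n_pairs

    return (1 - alpha) * linear + alpha * bilinear      # mix the two aggregations

# Toy example: a triangle graph with 4-dimensional features.
adj = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]], dtype=float)
x = np.random.rand(3, 4)
w1, w2 = np.random.rand(4, 4), np.random.rand(4, 4)
print(bilinear_layer(adj, x, w1, w2).shape)   # (3, 4)
```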
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.