Node Embedding using Mutual Information and Self-Supervision based
Bi-level Aggregation
- URL: http://arxiv.org/abs/2104.13014v1
- Date: Tue, 27 Apr 2021 07:32:57 GMT
- Title: Node Embedding using Mutual Information and Self-Supervision based
Bi-level Aggregation
- Authors: Kashob Kumar Roy, Amit Roy, A K M Mahbubur Rahman, M Ashraful Amin and
Amin Ahsan Ali
- Abstract summary: Graph Neural Networks learn low dimensional representations of nodes by aggregating information from their neighborhood in graphs.
We exploit mutual information (MI) to define two types of neighborhood: 1) Local Neighborhood, where nodes are densely connected within a community and each node shares higher MI with its neighbors, and 2) Non-Local Neighborhood, where MI-based node clustering is introduced to assemble informative but graphically distant nodes in the same cluster.
We show that our model significantly outperforms the state-of-the-art methods in a wide range of assortative and disassortative graphs.
- Score: 2.7088996845250897
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph Neural Networks (GNNs) learn low dimensional representations of nodes
by aggregating information from their neighborhood in graphs. However,
traditional GNNs suffer from two fundamental shortcomings due to their local
($l$-hop neighborhood) aggregation scheme. First, not all nodes in the
neighborhood carry relevant information for the target node. Since GNNs do not
exclude noisy nodes in their neighborhood, irrelevant information gets
aggregated, which reduces the quality of the representation. Second,
traditional GNNs also fail to capture long-range non-local dependencies between
nodes. To address these limitations, we exploit mutual information (MI) to
define two types of neighborhood, 1) \textit{Local Neighborhood} where nodes
are densely connected within a community and each node would share higher MI
with its neighbors, and 2) \textit{Non-Local Neighborhood} where MI-based node
clustering is introduced to assemble informative but graphically distant nodes
in the same cluster. To generate node representations, we combine the embeddings
generated by bi-level aggregation - local aggregation to aggregate features
from local neighborhoods to avoid noisy information and non-local aggregation
to aggregate features from non-local neighborhoods. Furthermore, we leverage
self-supervised learning to estimate MI with little labeled data. Finally, we
show that our model significantly outperforms the state-of-the-art methods in a
wide range of assortative and disassortative graphs.
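Below is a minimal, illustrative sketch of the bi-level aggregation idea described in the abstract. It assumes plain mean aggregation, an MI-based filter over graph neighbors, and concatenation as the combination step; all function and variable names are hypothetical and not taken from the paper.

    import numpy as np

    def bilevel_embedding(X, adj, clusters, mi):
        # X        : (N, d) node feature matrix
        # adj      : list of neighbor index lists, one per node
        # clusters : (N,) MI-based cluster assignment per node (non-local neighborhood)
        # mi       : (N, N) estimated mutual information between node pairs
        N, d = X.shape
        local = np.zeros((N, d))
        non_local = np.zeros((N, d))
        for v in range(N):
            # Local aggregation: keep only graph neighbors whose MI with the
            # target node is above the neighborhood average, so that noisy
            # neighbors are excluded before averaging.
            nbrs = adj[v]
            if len(nbrs) > 0:
                w = mi[v, nbrs]
                keep = [u for u, wi in zip(nbrs, w) if wi >= w.mean()]
                local[v] = X[keep].mean(axis=0) if keep else X[v]
            else:
                local[v] = X[v]
            # Non-local aggregation: average over graphically distant nodes
            # that fall into the same MI-based cluster as the target node.
            members = np.where(clusters == clusters[v])[0]
            non_local[v] = X[members].mean(axis=0)
        # Combine the two embeddings; concatenation is one simple choice.
        return np.concatenate([local, non_local], axis=1)

In the paper, the aggregators are learned rather than simple means and the MI estimates are obtained via self-supervised learning; the sketch above only mirrors the overall structure of combining a filtered local view with a cluster-based non-local view.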
Related papers
- NodeMixup: Tackling Under-Reaching for Graph Neural Networks [27.393295683072406]
Graph Neural Networks (GNNs) have become mainstream methods for solving the semi-supervised node classification problem.
Due to the uneven location distribution of labeled nodes in the graph, labeled nodes are only accessible to a small portion of unlabeled nodes, leading to the under-reaching issue.
To tackle under-reaching for GNNs, we propose an architecture-agnostic method dubbed NodeMixup.
arXiv Detail & Related papers (2023-12-20T13:56:27Z) - RBA-GCN: Relational Bilevel Aggregation Graph Convolutional Network for
Emotion Recognition [38.87080348908327]
We present the relational bilevel aggregation graph convolutional network (RBA-GCN).
It consists of three modules: the graph generation module (GGM), the similarity-based cluster building module (SCBM), and the bilevel aggregation module (BiAM).
On both the IEMOCAP and MELD datasets, the weighted average F1 score of RBA-GCN shows a 2.17% to 5.21% improvement over that of the most advanced method.
arXiv Detail & Related papers (2023-08-18T11:29:12Z) - Contrastive Meta-Learning for Few-shot Node Classification [54.36506013228169]
Few-shot node classification aims to predict labels for nodes on graphs with only limited labeled nodes as references.
We create a novel contrastive meta-learning framework on graphs, named COSMIC, with two key designs.
arXiv Detail & Related papers (2023-06-27T02:22:45Z) - LSGNN: Towards General Graph Neural Network in Node Classification by
Local Similarity [59.41119013018377]
We propose to use the local similarity (LocalSim) to learn node-level weighted fusion, which can also serve as a plug-and-play module.
For better fusion, we propose a novel and efficient Initial Residual Difference Connection (IRDC) to extract more informative multi-hop information.
Our proposed method, namely Local Similarity Graph Neural Network (LSGNN), can offer comparable or superior state-of-the-art performance on both homophilic and heterophilic graphs.
arXiv Detail & Related papers (2023-05-07T09:06:11Z) - A Variational Edge Partition Model for Supervised Graph Representation
Learning [51.30365677476971]
This paper introduces a graph generative process to model how the observed edges are generated by aggregating the node interactions over a set of overlapping node communities.
We partition each edge into the summation of multiple community-specific weighted edges and use them to define community-specific GNNs.
A variational inference framework is proposed to jointly learn a GNN-based inference network that partitions the edges into different communities, these community-specific GNNs, and a GNN-based predictor that combines community-specific GNNs for the end classification task.
arXiv Detail & Related papers (2022-02-07T14:37:50Z) - Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z) - Node-wise Localization of Graph Neural Networks [52.04194209002702]
Graph neural networks (GNNs) emerge as a powerful family of representation learning models on graphs.
We propose a node-wise localization of GNNs by accounting for both global and local aspects of the graph.
We conduct extensive experiments on four benchmark graphs, and consistently obtain promising performance surpassing the state-of-the-art GNNs.
arXiv Detail & Related papers (2021-10-27T10:02:03Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC) metric, a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - Hop-Aware Dimension Optimization for Graph Neural Networks [11.341455005324104]
We propose a simple yet effective ladder-style GNN architecture, namely LADDER-GNN.
Specifically, we separate messages from different hops and assign different dimensions for them before concatenating them to obtain the node representation.
Results show that the proposed simple hop-aware representation learning solution can achieve state-of-the-art performance on most datasets.
arXiv Detail & Related papers (2021-05-30T10:12:56Z) - Towards Deeper Graph Neural Networks with Differentiable Group
Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN).
arXiv Detail & Related papers (2020-06-12T07:18:02Z) - Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
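As an illustration of the pairwise neighbor interaction described in the Bilinear Graph Neural Network entry above, the following sketch augments a mean aggregator with element-wise products over neighbor pairs. The mixing weight alpha and the exact form of both terms are assumptions for illustration, not the published BGNN operator.

    import numpy as np

    def bilinear_aggregate(X, nbrs, alpha=0.5):
        # X    : (N, d) node representations
        # nbrs : indices of the target node's neighbors (assumed non-empty)
        H = X[nbrs]
        linear = H.mean(axis=0)  # standard weighted-sum (here: mean) aggregation
        # Average of element-wise products over all unordered neighbor pairs,
        # using the identity sum_{i<j} h_i * h_j = ((sum h)^2 - sum h^2) / 2.
        s, sq = H.sum(axis=0), (H ** 2).sum(axis=0)
        n_pairs = max(len(nbrs) * (len(nbrs) - 1) // 2, 1)
        bilinear = (s ** 2 - sq) / (2 * n_pairs)
        # Augment the weighted sum with the pairwise-interaction term.
        return (1 - alpha) * linear + alpha * bilinear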
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.