Finding Heterophilic Neighbors via Confidence-based Subgraph Matching
for Semi-supervised Node Classification
- URL: http://arxiv.org/abs/2302.09755v2
- Date: Wed, 12 Apr 2023 08:57:05 GMT
- Title: Finding Heterophilic Neighbors via Confidence-based Subgraph Matching
for Semi-supervised Node Classification
- Authors: Yoonhyuk Choi, Jiho Choi, Taewook Ko, Chong-Kwon Kim
- Abstract summary: Graph Neural Networks (GNNs) have proven to be powerful in many graph-based applications.
However, they fail to generalize well under heterophilic setups.
- Score: 1.3190581566723918
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have proven to be powerful in many graph-based
applications. However, they fail to generalize well under heterophilic setups,
where neighbor nodes have different labels. To address this challenge, we
employ a confidence ratio as a hyper-parameter, assuming that some of the edges
are disassortative (heterophilic). Here, we propose a two-phased algorithm.
Firstly, we determine edge coefficients through subgraph matching using a
supplementary module. Then, we apply GNNs with a modified label propagation
mechanism to utilize the edge coefficients effectively. Specifically, our
supplementary module identifies a certain proportion of task-irrelevant edges
based on a given confidence ratio. Using the remaining edges, we employ the
widely used optimal transport to measure the similarity between two nodes with
their subgraphs. Finally, using the coefficients as supplementary information
in GNNs, we improve the label propagation mechanism so that node pairs
connected by low-weight edges are kept from being drawn closer together. The
experiments on benchmark datasets show that our model alleviates
over-smoothing and improves performance.
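The two-phase idea described in the abstract can be sketched in a few lines: phase one drops the lowest-scoring fraction of edges as presumed heterophilic, and the remaining edge coefficients come from an entropic optimal-transport (Sinkhorn) similarity between the feature sets of two nodes' subgraphs. This is a minimal illustration, not the paper's exact formulation; all function names and the mapping from OT cost to a similarity score are assumptions.

```python
import numpy as np

def sinkhorn_similarity(X, Y, reg=0.1, n_iters=100):
    """Entropic-OT similarity between two node-feature sets (rows are
    subgraph nodes). Higher means the subgraphs are more alike.
    Illustrative only; the paper's exact formulation may differ."""
    C = np.linalg.norm(X[:, None, :] - Y[None, :, :], axis=-1)  # pairwise cost
    K = np.exp(-C / reg)                       # Gibbs kernel
    a = np.full(len(X), 1.0 / len(X))          # uniform marginals
    b = np.full(len(Y), 1.0 / len(Y))
    u = np.ones_like(a)
    for _ in range(n_iters):                   # Sinkhorn iterations
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]            # transport plan
    return float(np.exp(-(P * C).sum()))       # map OT cost into (0, 1]

def prune_edges(edges, scores, confidence_ratio):
    """Phase 1 (sketch): drop the lowest-scoring fraction of edges,
    treating them as likely task-irrelevant (heterophilic)."""
    n_drop = int(confidence_ratio * len(edges))
    order = np.argsort(scores)                 # ascending: weakest first
    return [edges[i] for i in order[n_drop:]]
```

For example, with four edges and a confidence ratio of 0.25, the single weakest-scoring edge is removed before the GNN phase.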
Related papers
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophilic Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- Neighborhood Homophily-based Graph Convolutional Network [4.511171093050241]
Graph neural networks (GNNs) have proven powerful in graph-oriented tasks.
Many real-world graphs are heterophilous, challenging the homophily assumption of classical GNNs.
Recent studies propose new metrics to characterize homophily, but rarely consider the correlation between the proposed metrics and models.
In this paper, we first design a new metric, Neighborhood Homophily (NH), to measure the label complexity or purity in node neighborhoods.
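A neighborhood-purity score of the kind the summary describes can be illustrated as the fraction of a node's neighbors that carry the most frequent neighbor label. This is one plausible reading, not the paper's exact NH definition; the function name is an assumption.

```python
from collections import Counter

def neighborhood_purity(adj, labels, node):
    """Fraction of `node`'s neighbors sharing the most frequent neighbor
    label -- a sketch of a neighborhood label-purity metric; the paper's
    exact NH definition may differ."""
    neigh = [v for v, connected in enumerate(adj[node]) if connected]
    if not neigh:
        return 1.0  # an isolated node is trivially pure
    counts = Counter(labels[v] for v in neigh)
    return counts.most_common(1)[0][1] / len(neigh)
```

A node whose neighbors all share one label scores 1.0; mixed neighborhoods score lower, signaling heterophily.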
arXiv Detail & Related papers (2023-01-24T07:56:44Z)
- Bring Your Own View: Graph Neural Networks for Link Prediction with Personalized Subgraph Selection [57.34881616131377]
We introduce a Personalized Subgraph Selector (PS2) as a plug-and-play framework to automatically, personally, and inductively identify optimal subgraphs for different edges.
PS2 is instantiated as a bi-level optimization problem that can be efficiently solved differentiably.
We suggest a brand-new angle towards GNNLP training: first identifying the optimal subgraphs for edges, and then training the inference model on the sampled subgraphs.
arXiv Detail & Related papers (2022-12-23T17:30:19Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
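The von Neumann entropy of a graph is commonly computed as the eigen-entropy of the trace-normalized Laplacian; a short sketch of that standard construction (shown only to illustrate the kind of metric the summary refers to, not the paper's exact variant):

```python
import numpy as np

def von_neumann_entropy(adj):
    """Von Neumann entropy of a graph: eigen-entropy of the
    trace-normalized Laplacian (one standard construction)."""
    deg = np.diag(adj.sum(axis=1))
    lap = deg - adj
    rho = lap / np.trace(lap)            # density matrix with trace 1
    eigvals = np.linalg.eigvalsh(rho)
    eigvals = eigvals[eigvals > 1e-12]   # convention: 0 * log 0 = 0
    return float(-(eigvals * np.log(eigvals)).sum())
```

A single edge yields entropy 0, while a triangle yields ln 2, reflecting the more spread-out Laplacian spectrum.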
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Learnt Sparsification for Interpretable Graph Neural Networks [5.527927312898106]
We propose a novel method called Kedge for explicitly sparsifying the underlying graph by removing unnecessary neighbors.
Kedge learns edge masks in a modular fashion trained with any GNN allowing for gradient based optimization.
We show that Kedge effectively counters the over-smoothing phenomena in deep GNNs by maintaining good task performance with increasing GNN layers.
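The edge-mask idea in the summary can be sketched as one trainable logit per edge: the sigmoid gives a soft, differentiable gate usable during GNN training, and thresholding gives the pruned graph at inference. Kedge itself uses a stochastic hard-concrete mask, so this is a simplified stand-in with assumed names.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def masked_adjacency(adj, edge_logits, threshold=0.5):
    """Sketch of a learnable edge sparsifier in the spirit of Kedge:
    soft mask for gradient-based training, hard mask for inference."""
    mask = sigmoid(edge_logits)           # soft, differentiable gate per edge
    soft = adj * mask                     # used while training with a GNN
    hard = adj * (mask > threshold)       # sparsified graph at inference
    return soft, hard
```

Edges whose logits are pushed below zero during training end up removed, which is how such masks counter over-smoothing in deeper stacks.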
arXiv Detail & Related papers (2021-06-23T16:04:25Z)
- Sequential Graph Convolutional Network for Active Learning [53.99104862192055]
We propose a novel pool-based Active Learning framework constructed on a sequential Graph Convolution Network (GCN).
With a small number of randomly sampled images as seed labelled examples, we learn the parameters of the graph to distinguish labelled vs unlabelled nodes.
We exploit these characteristics of GCN to select the unlabelled examples which are sufficiently different from labelled ones.
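The selection step described above can be sketched as a greedy core-set-style criterion: repeatedly pick the unlabelled node whose embedding is farthest from any labelled node. This is one common instantiation of "sufficiently different from labelled ones"; the paper also studies other criteria, and the function name is an assumption.

```python
import numpy as np

def select_queries(embeddings, labelled_idx, unlabelled_idx, budget):
    """Greedy farthest-first selection of unlabelled nodes, a sketch of
    the core-set-style criterion described in the summary."""
    labelled = list(labelled_idx)
    chosen = []
    for _ in range(budget):
        dists = {
            u: min(np.linalg.norm(embeddings[u] - embeddings[l])
                   for l in labelled)
            for u in unlabelled_idx if u not in chosen
        }
        pick = max(dists, key=dists.get)   # most different from labelled set
        chosen.append(pick)
        labelled.append(pick)              # treat as labelled for next round
    return chosen
```

Each pick is added to the labelled pool before the next round, so the budget is spread over distinct regions of the embedding space.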
arXiv Detail & Related papers (2020-06-18T00:55:10Z)
- Towards Deeper Graph Neural Networks with Differentiable Group Normalization [61.20639338417576]
Graph neural networks (GNNs) learn the representation of a node by aggregating its neighbors.
Over-smoothing is one of the key issues which limit the performance of GNNs as the number of layers increases.
We introduce two over-smoothing metrics and a novel technique, i.e., differentiable group normalization (DGN).
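The core mechanism can be sketched as: softly assign nodes to groups with a learnable matrix, normalize each group's embeddings separately, and combine with a residual. The names and the final combination here are simplifications of the paper's DGN layer, not its exact form.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def group_normalize(H, W, eps=1e-5):
    """Sketch of differentiable group normalization: per-group mean/var
    normalization weighted by a soft cluster assignment S = softmax(HW)."""
    S = softmax(H @ W)                       # soft group assignment (n, g)
    out = np.zeros_like(H)
    for g in range(S.shape[1]):
        w = S[:, [g]]                        # membership weights for group g
        mu = (w * H).sum(0) / (w.sum() + eps)
        var = (w * (H - mu) ** 2).sum(0) / (w.sum() + eps)
        out += w * (H - mu) / np.sqrt(var + eps)
    return H + out                           # residual keeps original signal
```

Normalizing within groups rather than globally keeps embeddings of different clusters separated as depth grows, which is the anti-over-smoothing intuition.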
arXiv Detail & Related papers (2020-06-12T07:18:02Z)
- Bilinear Graph Neural Network with Neighbor Interactions [106.80781016591577]
Graph Neural Network (GNN) is a powerful model to learn representations and make predictions on graph data.
We propose a new graph convolution operator, which augments the weighted sum with pairwise interactions of the representations of neighbor nodes.
We term this framework as Bilinear Graph Neural Network (BGNN), which improves GNN representation ability with bilinear interactions between neighbor nodes.
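The bilinear idea can be sketched as augmenting the usual mean of neighbor representations with the average elementwise product over neighbor pairs, capturing neighbor-neighbor interactions. This is a simplification of the paper's BGNN operator; the function name and the equal-weight combination are assumptions.

```python
import numpy as np

def bilinear_aggregate(adj, H):
    """Mean neighbor aggregation plus average pairwise (elementwise)
    products of neighbor pairs -- a sketch of the bilinear interaction."""
    n, d = H.shape
    out = np.zeros_like(H)
    for v in range(n):
        neigh = np.flatnonzero(adj[v])
        if len(neigh) == 0:
            continue
        linear = H[neigh].mean(axis=0)           # standard aggregation
        pairs = [H[i] * H[j] for k, i in enumerate(neigh)
                 for j in neigh[k + 1:]]         # all neighbor pairs
        bilinear = np.mean(pairs, axis=0) if pairs else np.zeros(d)
        out[v] = linear + bilinear
    return out
```

Nodes with a single neighbor get no bilinear term, so the operator gracefully reduces to plain aggregation there.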
arXiv Detail & Related papers (2020-02-10T06:43:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.