Two Sides of the Same Coin: Heterophily and Oversmoothing in Graph
Convolutional Neural Networks
- URL: http://arxiv.org/abs/2102.06462v1
- Date: Fri, 12 Feb 2021 11:52:34 GMT
- Authors: Yujun Yan, Milad Hashemi, Kevin Swersky, Yaoqing Yang, Danai Koutra
- Abstract summary: We theoretically characterize the connections between heterophily and oversmoothing.
We design a model that addresses the discrepancy in features and degrees between neighbors by incorporating signed messages and learned degree corrections.
Our experiments on 9 real networks show that our model achieves state-of-the-art performance under heterophily.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most graph neural networks (GNNs) perform poorly in graphs where neighbors
typically have different features/classes (heterophily) and when stacking
multiple layers (oversmoothing). These two seemingly unrelated problems have
been studied independently, but there is recent empirical evidence that solving
one problem may benefit the other. In this work, going beyond empirical
observations, we theoretically characterize the connections between heterophily
and oversmoothing, both of which lead to indistinguishable node
representations. By modeling the change in node representations during message
propagation, we theoretically analyze the factors (e.g., degree, heterophily
level) that make the representations of nodes from different classes
indistinguishable. Our analysis highlights that (1) nodes with high heterophily
and nodes with low heterophily and low degrees relative to their neighbors
(degree discrepancy) trigger the oversmoothing problem, and (2) allowing
"negative" messages between neighbors can decouple the heterophily and
oversmoothing problems. Based on our insights, we design a model that addresses
the discrepancy in features and degrees between neighbors by incorporating
signed messages and learned degree corrections. Our experiments on 9 real
networks show that our model achieves state-of-the-art performance under
heterophily, and performs comparably to existing GNNs under low
heterophily (homophily). It also effectively addresses oversmoothing and even
benefits from multiple layers.
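The remedy the abstract describes (signed messages plus degree corrections) can be sketched in a few lines. The following is a minimal, hypothetical illustration, not the authors' model: message signs are taken from the cosine similarity of the endpoint representations (negative for dissimilar neighbors), and a fixed sqrt(d_i/d_j) rescaling stands in for the learned degree corrections described in the abstract.

```python
import numpy as np

def signed_layer(H, A, beta=1.0):
    """One propagation step with signed messages and a simple degree
    correction (a sketch, NOT the paper's implementation).
    H: (n, f) node representations; A: (n, n) binary adjacency, no self-loops.
    """
    deg = A.sum(axis=1)                                  # node degrees
    norm = np.linalg.norm(H, axis=1, keepdims=True) + 1e-12
    Hn = H / norm                                        # unit-normalized rows
    S = Hn @ Hn.T                                        # cosine similarity -> message signs
    # degree correction: rescale messages by sqrt(d_i / d_j)
    T = np.sqrt(np.outer(deg, 1.0 / np.maximum(deg, 1.0)))
    M = A * S * T                                        # signed, degree-corrected weights
    return H + beta * (M @ H) / np.maximum(deg, 1.0)[:, None]
```

On a toy bipartite (fully heterophilous) graph, plain mean aggregation would average each node toward its opposite-class neighbors, collapsing the classes; the signed variant flips those messages and pushes the two classes apart instead.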
Related papers
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
Homophily principle: nodes with the same labels or similar attributes are more likely to be connected.
Recent work has identified a non-trivial set of datasets where GNNs' performance is not satisfactory compared to that of graph-agnostic NNs.
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- The Heterophilic Snowflake Hypothesis: Training and Empowering GNNs for Heterophilic Graphs [59.03660013787925]
We introduce the Heterophilic Snowflake Hypothesis and provide an effective solution to guide and facilitate research on heterophilic graphs.
Our observations show that our framework acts as a versatile operator for diverse tasks.
It can be integrated into various GNN frameworks, boosting performance in-depth and offering an explainable approach to choosing the optimal network depth.
arXiv Detail & Related papers (2024-06-18T12:16:00Z)
- Heterophilous Distribution Propagation for Graph Neural Networks [23.897535976924722]
We propose heterophilous distribution propagation (HDP) for graph neural networks.
Instead of aggregating information from all neighbors alike, HDP adaptively separates the neighbors into homophilous and heterophilous parts.
We conduct extensive experiments on 9 benchmark datasets with different levels of homophily.
arXiv Detail & Related papers (2024-05-31T06:40:56Z)
- Generation is better than Modification: Combating High Class Homophily Variance in Graph Anomaly Detection [51.11833609431406]
Homophily distribution differences between different classes are significantly greater than those in homophilic and heterophilic graphs.
We introduce a new metric called Class Homophily Variance, which quantitatively describes this phenomenon.
To mitigate its impact, we propose a novel GNN model named Homophily Edge Generation Graph Neural Network (HedGe).
arXiv Detail & Related papers (2024-03-15T14:26:53Z)
- Demystifying Structural Disparity in Graph Neural Networks: Can One Size Fit All? [61.35457647107439]
Most real-world homophilic and heterophilic graphs consist of a mixture of nodes exhibiting both homophilic and heterophilic structural patterns.
We provide evidence that Graph Neural Networks (GNNs) on node classification typically perform admirably on homophilic nodes.
We then propose a rigorous, non-i.i.d. PAC-Bayesian generalization bound for GNNs, revealing reasons for the performance disparity.
arXiv Detail & Related papers (2023-06-02T07:46:20Z)
- Ordered GNN: Ordering Message Passing to Deal with Heterophily and Over-smoothing [24.86998128873837]
We propose to order the messages passing into the node representation, with specific blocks of neurons targeted for message passing within specific hops.
Experimental results on an extensive set of datasets show that our model can simultaneously achieve the state-of-the-art in both homophily and heterophily settings.
arXiv Detail & Related papers (2023-02-03T03:38:50Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Powerful Graph Convolutional Networks with Adaptive Propagation Mechanism for Homophily and Heterophily [38.50800951799888]
Graph Convolutional Networks (GCNs) have been widely applied in various fields due to their significant power on processing graph-structured data.
Existing methods deal with heterophily mainly by aggregating higher-order neighborhoods or combining the immediate representations.
This paper proposes a novel propagation mechanism, which can automatically change the propagation and aggregation process according to homophily or heterophily.
arXiv Detail & Related papers (2021-12-27T08:19:23Z)
- Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed methods on some commonly used heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z)
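Many of the papers above quantify a graph's "heterophily level" via the edge homophily ratio: the fraction of edges whose endpoints share a label (one of several definitions in the literature). A minimal sketch of how it is commonly computed:

```python
def edge_homophily(edges, labels):
    """Fraction of edges whose endpoints share a label.
    edges: iterable of (u, v) node-index pairs; labels: sequence of node labels.
    Values near 1 indicate homophily; values near 0 indicate strong heterophily."""
    edges = list(edges)
    same = sum(labels[u] == labels[v] for u, v in edges)
    return same / len(edges)
```

For example, a four-node graph with two same-label edges out of four has ratio 0.5, a mixed regime where both homophilous and heterophilous methods are commonly compared.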
This list is automatically generated from the titles and abstracts of the papers in this site.