Homophily modulates double descent generalization in graph convolution
networks
- URL: http://arxiv.org/abs/2212.13069v3
- Date: Tue, 23 Jan 2024 05:44:44 GMT
- Title: Homophily modulates double descent generalization in graph convolution
networks
- Authors: Cheng Shi, Liming Pan, Hong Hu and Ivan Dokmanić
- Abstract summary: We show how risk is shaped by the interplay between the graph noise, feature noise, and the number of training labels.
We use our analytic insights to improve performance of state-of-the-art graph convolution networks on heterophilic datasets.
- Score: 33.703222768801574
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) excel in modeling relational data such as
biological, social, and transportation networks, but the underpinnings of their
success are not well understood. Traditional complexity measures from
statistical learning theory fail to account for observed phenomena like the
double descent or the impact of relational semantics on generalization error.
Motivated by experimental observations of "transductive" double descent in
key networks and datasets, we use analytical tools from statistical physics and
random matrix theory to precisely characterize generalization in simple graph
convolution networks on the contextual stochastic block model. Our results
illuminate the nuances of learning on homophilic versus heterophilic data and
predict double descent whose existence in GNNs has been questioned by recent
work. We show how risk is shaped by the interplay between the graph noise,
feature noise, and the number of training labels. Our findings apply beyond
stylized models, capturing qualitative trends in real-world GNNs and datasets.
As a case in point, we use our analytic insights to improve performance of
state-of-the-art graph convolution networks on heterophilic datasets.
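To make the analyzed setting concrete, below is a minimal sketch, not the authors' code, of transductive regression with a single graph convolution on a two-block contextual stochastic block model; the model sizes, noise levels, and ridge readout are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions throughout, not the paper's code):
# a two-block contextual stochastic block model (CSBM) plus one graph
# convolution, with a ridge-regression readout fit on a subset of labels.
import numpy as np

rng = np.random.default_rng(0)

def csbm(n=400, d=50, p_in=0.10, p_out=0.02, feat_snr=1.0):
    """Two-block CSBM: homophilic if p_in > p_out, heterophilic if p_in < p_out."""
    y = np.repeat([-1.0, 1.0], n // 2)                # balanced +/-1 communities
    probs = np.where(np.equal.outer(y, y), p_in, p_out)
    A = (rng.random((n, n)) < probs).astype(float)
    A = np.triu(A, 1); A = A + A.T                    # symmetric, no self-loops
    u = rng.normal(size=d) / np.sqrt(d)               # latent feature direction
    X = feat_snr * np.outer(y, u) + rng.normal(size=(n, d)) / np.sqrt(d)
    return A, X, y

def test_risk(n_labels, ridge=1e-3):
    A, X, y = csbm()
    n = len(y)
    S = A + np.eye(n)
    S = S / S.sum(1, keepdims=True)                   # row-normalized propagation
    H = S @ X                                         # one graph convolution
    train = rng.choice(n, size=n_labels, replace=False)
    test = np.setdiff1d(np.arange(n), train)
    w = np.linalg.solve(H[train].T @ H[train] + ridge * np.eye(H.shape[1]),
                        H[train].T @ y[train])        # ridge readout
    return float(np.mean((H[test] @ w - y[test]) ** 2))

# Sweeping the label count through the feature dimension (d = 50) is the
# regime where a double-descent-like bump in test risk can appear.
for m in (10, 25, 50, 75, 100, 200):
    print(m, round(test_risk(m), 3))
```

Flipping p_in and p_out makes the graph heterophilic, which is where the paper's analysis predicts qualitatively different risk behavior.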
Related papers
- Unitary convolutions for learning on graphs and groups [0.9899763598214121]
We study unitary group convolutions, which allow for deeper networks that are more stable during training.
The main focus of the paper is graph neural networks, where we show that unitary graph convolutions provably avoid over-smoothing.
Our experimental results confirm that unitary graph convolutional networks achieve competitive performance on benchmark datasets.
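As a hedged aside on why unitarity prevents over-smoothing (a generic construction, not necessarily the paper's parameterization): for a real symmetric A, the matrix iA is skew-Hermitian, so exp(iA) is unitary, and repeated propagation preserves feature norms, whereas averaging operators collapse features toward a constant.

```python
# Sketch: unitary propagation via the matrix exponential of a skew-Hermitian
# operator (a generic construction; the paper's parameterization may differ).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
n = 30
A = rng.random((n, n)); A = (A + A.T) / 2            # symmetric weighted "adjacency"
U = expm(1j * A)                                     # unitary, since iA is skew-Hermitian

x = rng.normal(size=n).astype(complex)
for _ in range(100):                                 # 100 propagation steps
    x = U @ x
print(np.linalg.norm(x))                             # norm preserved: no collapse

# Contrast: a row-stochastic averaging operator over-smooths.
P = A + np.eye(n); P = P / P.sum(1, keepdims=True)
z = rng.normal(size=n)
for _ in range(100):
    z = P @ z
print(np.ptp(z))                                     # spread shrinks toward zero
```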
arXiv Detail & Related papers (2024-10-07T21:09:14Z)
- Heterogeneous Sheaf Neural Networks [17.664754528494132]
Heterogeneous graphs are commonly used to model relational structures in many real-world applications.
We propose using cellular sheaves to model the heterogeneity in the graph's underlying topology.
We introduce HetSheaf, a general framework for heterogeneous sheaf neural networks, and a series of heterogeneous sheaf predictors.
arXiv Detail & Related papers (2024-09-12T13:38:08Z)
- Generalization of Graph Neural Networks is Robust to Model Mismatch [84.01980526069075]
Graph neural networks (GNNs) have demonstrated their effectiveness in various tasks supported by their generalization capabilities.
In this paper, we examine GNNs that operate on geometric graphs generated from manifold models.
Our analysis reveals the robustness of GNN generalization in the presence of such model mismatch.
arXiv Detail & Related papers (2024-08-25T16:00:44Z)
- The Heterophilic Graph Learning Handbook: Benchmarks, Models, Theoretical Analysis, Applications and Challenges [101.83124435649358]
The homophily principle, i.e., that nodes with the same labels or similar attributes are more likely to be connected, has long guided GNN design.
Recent work has identified a non-trivial set of datasets on which GNNs perform unsatisfactorily compared to graph-agnostic NNs.
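For reference, the measure implicit here is edge homophily, the fraction of edges whose endpoints share a label; a minimal sketch (the helper name and toy graph are my own):

```python
# Edge homophily: fraction of edges joining same-label endpoints. A standard
# measure in the heterophily literature; the helper below is illustrative.
import numpy as np

def edge_homophily(A, y):
    """A: symmetric 0/1 adjacency matrix; y: integer node labels."""
    src, dst = np.triu(A, 1).nonzero()               # count each undirected edge once
    return float(np.mean(y[src] == y[dst]))

# Toy check: path 0-1-2-3 with labels [0, 0, 1, 1] -> 2 of 3 edges match.
A = np.zeros((4, 4))
for i, j in [(0, 1), (1, 2), (2, 3)]:
    A[i, j] = A[j, i] = 1.0
print(edge_homophily(A, np.array([0, 0, 1, 1])))     # 0.666...
```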
arXiv Detail & Related papers (2024-07-12T18:04:32Z)
- Graph Out-of-Distribution Generalization via Causal Intervention [69.70137479660113]
We introduce a conceptually simple yet principled approach for training robust graph neural networks (GNNs) under node-level distribution shifts.
Our method resorts to a new learning objective derived from causal inference that coordinates an environment estimator and a mixture-of-expert GNN predictor.
Our model can effectively enhance generalization under various types of distribution shifts and yield up to a 27.4% accuracy improvement over state-of-the-art methods on graph OOD generalization benchmarks.
arXiv Detail & Related papers (2024-02-18T07:49:22Z)
- Revealing Decurve Flows for Generalized Graph Propagation [108.80758541147418]
This study addresses the limitations of the traditional analysis of message passing, central to graph learning, by defining generalized propagation with directed and weighted graphs.
We include a preliminary exploration of learned propagation patterns in datasets, a first in the field.
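A minimal sketch of the generalized setting only, propagation driven by a directed, weighted operator rather than a fixed symmetric adjacency; the operator, normalization, and step count are illustrative assumptions, and the paper's decurve-flow analysis is not reproduced:

```python
# Sketch: message passing with a directed, weighted propagation operator
# (illustrative assumptions; not the paper's specific analysis).
import numpy as np

rng = np.random.default_rng(2)
n, d = 6, 4
W = rng.random((n, n)) * (rng.random((n, n)) < 0.4)  # sparse directed weights
W = W / np.maximum(W.sum(1, keepdims=True), 1e-12)   # normalize outgoing weights
X = rng.normal(size=(n, d))                          # node features

def propagate(W, X, steps=3):
    for _ in range(steps):
        X = W @ X                                    # one directed propagation step
    return X

print(propagate(W, X).round(2))
```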
arXiv Detail & Related papers (2024-02-13T14:13:17Z)
- Universally Robust Graph Neural Networks by Preserving Neighbor Similarity [5.660584039688214]
We introduce a novel robust model, termed NSPGNN, which incorporates a dual k-NN graph pipeline to supervise neighbor-similarity-guided propagation.
Experiments on both homophilic and heterophilic graphs validate the universal robustness of NSPGNN compared to the state-of-the-art methods.
arXiv Detail & Related papers (2024-01-18T06:57:29Z) - Learning Graph Structure from Convolutional Mixtures [119.45320143101381]
We propose a graph convolutional relationship between the observed and latent graphs, and formulate the graph learning task as a network inverse (deconvolution) problem.
In lieu of eigendecomposition-based spectral methods, we unroll and truncate proximal gradient iterations to arrive at a parameterized neural network architecture that we call a Graph Deconvolution Network (GDN).
GDNs can learn a distribution of graphs in a supervised fashion, perform link prediction or edge-weight regression tasks by adapting the loss function, and they are inherently inductive.
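A rough sketch of the unrolling idea under stated assumptions: the observed graph is a low-order polynomial mixture of a sparse latent adjacency, and fixed-step proximal gradient (ISTA) iterations are truncated; in a GDN the step sizes and thresholds would be learned rather than fixed as they are here.

```python
# Sketch: graph deconvolution via truncated proximal gradient (ISTA) steps.
# Assumptions: observed graph A_obs = c0*A + c1*A@A for a sparse latent A;
# fixed step size and threshold (learned in an actual GDN).
import numpy as np

rng = np.random.default_rng(3)
n = 20
A = (rng.random((n, n)) < 0.15).astype(float)
A = np.triu(A, 1); A = A + A.T                       # sparse latent graph
c0, c1 = 1.0, 0.5
A_obs = c0 * A + c1 * (A @ A)                        # observed convolutive mixture

def soft_threshold(Z, t):
    return np.sign(Z) * np.maximum(np.abs(Z) - t, 0.0)

Z = np.zeros((n, n))                                 # estimate of the latent graph
step, thresh = 1e-2, 1e-3
for _ in range(500):                                 # unrolled, truncated iterations
    R = c0 * Z + c1 * (Z @ Z) - A_obs                # residual of the mixture model
    grad = c0 * R + c1 * (R @ Z.T + Z.T @ R)         # gradient of 0.5*||R||_F^2
    Z = soft_threshold(Z - step * grad, thresh)      # prox step promotes sparsity

# Small relative error indicates the truncated iterations (approximately) recovered A.
print(np.linalg.norm(Z - A) / np.linalg.norm(A))
```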
arXiv Detail & Related papers (2022-05-19T14:08:15Z)
- Heterogeneous Graph Neural Networks using Self-supervised Reciprocally Contrastive Learning [102.9138736545956]
Heterogeneous graph neural networks (HGNNs) are a popular technique for modeling and analyzing heterogeneous graphs.
We develop a novel and robust heterogeneous graph contrastive learning approach, HGCL, which introduces two views guided respectively by node attributes and graph topologies.
In this approach, we adopt distinct, well-suited attribute- and topology-fusion mechanisms in the two views, which help mine the relevant information in attributes and topologies separately.
arXiv Detail & Related papers (2022-04-30T12:57:02Z)
- Beyond Homophily in Graph Neural Networks: Current Limitations and Effective Designs [28.77753005139331]
We investigate the representation power of graph neural networks in a semi-supervised node classification task under heterophily or low homophily.
Many popular GNNs fail to generalize to this setting, and are even outperformed by models that ignore the graph structure.
We identify a set of key designs that boost learning from the graph structure under heterophily.
arXiv Detail & Related papers (2020-06-20T02:05:01Z)
This list is automatically generated from the titles and abstracts of the papers on this site.