Effective Eigendecomposition based Graph Adaptation for Heterophilic Networks
- URL: http://arxiv.org/abs/2107.13312v1
- Date: Wed, 28 Jul 2021 12:14:07 GMT
- Title: Effective Eigendecomposition based Graph Adaptation for Heterophilic Networks
- Authors: Vijay Lingam, Rahul Ragesh, Arun Iyer, Sundararajan Sellamanickam
- Abstract summary: We present an eigendecomposition based approach and propose EigenNetwork models that improve the performance of GNNs on heterophilic graphs.
Our approach achieves up to 11% improvement in performance over the state-of-the-art methods on heterophilic graphs.
- Score: 0.5309004257911242
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) exhibit excellent performance when graphs have
a strong homophily property, i.e., connected nodes have the same labels. However,
they perform poorly on heterophilic graphs. Several approaches address the
issue of heterophily by proposing models that adapt the graph by optimizing a
task-specific loss function using labelled data. These adaptations are made
either via attention or by attenuating or enhancing various
low-frequency/high-frequency signals, as needed for the task at hand. More
recent approaches adapt the eigenvalues of the graph. One important
interpretation of this adaptation is that these models select/weigh the
eigenvectors of the graph. Based on this interpretation, we present an
eigendecomposition based approach and propose EigenNetwork models that improve
the performance of GNNs on heterophilic graphs. Performance improvement is
achieved by learning flexible graph adaptation functions that modulate the
eigenvalues of the graph. Regularizing these functions via parameter sharing
further improves performance. Our approach achieves up to 11% improvement over
state-of-the-art methods on heterophilic graphs.
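To make the core idea concrete, here is a minimal PyTorch-style sketch of modulating the eigenvalues of a precomputed graph eigendecomposition. The layer name EigenAdaptLayer, the MLP parameterization of the adaptation function, and the use of a symmetrically normalized adjacency are illustrative assumptions, not the paper's exact EigenNetwork formulation.

```python
import torch
import torch.nn as nn

class EigenAdaptLayer(nn.Module):
    """Sketch: adapt the graph as U diag(g_theta(lambda)) U^T,
    where g_theta is a small learnable function over eigenvalues."""

    def __init__(self, in_dim, out_dim, hidden=16):
        super().__init__()
        # g_theta: learnable modulation applied to each eigenvalue independently
        self.g = nn.Sequential(nn.Linear(1, hidden), nn.Tanh(), nn.Linear(hidden, 1))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, eigvecs, eigvals, x):
        # eigvecs: (n, k) top-k eigenvectors, eigvals: (k,), x: (n, in_dim)
        mod = self.g(eigvals.unsqueeze(-1)).squeeze(-1)    # modulated eigenvalues, (k,)
        adapted = eigvecs @ torch.diag(mod) @ eigvecs.t()  # adapted graph, (n, n)
        return torch.relu(adapted @ self.lin(x))

# Usage sketch: eigendecompose a normalized adjacency once, then train end-to-end.
# eigvals, eigvecs = torch.linalg.eigh(a_norm)
# out = EigenAdaptLayer(feat_dim, num_classes)(eigvecs[:, -k:], eigvals[-k:], features)
```

Sharing the modulation MLP across layers would be one way to realize the parameter-sharing regularization mentioned in the abstract.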
Related papers
- Through the Dual-Prism: A Spectral Perspective on Graph Data Augmentation for Graph Classification [71.36575018271405]
We introduce the Dual-Prism (DP) augmentation method, comprising DP-Noise and DP-Mask.
We find that keeping the low-frequency eigenvalues unchanged can preserve the critical properties at a large scale when generating augmented graphs.
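A rough NumPy sketch of the spectral idea described above: perturb the spectrum while keeping the low-frequency eigenvalues fixed. The frequency cutoff, noise scale, and the use of the symmetric normalized Laplacian are assumptions for illustration; a DP-Mask-style variant would presumably mask eigenvalues instead of adding noise, which is not shown here.

```python
import numpy as np

def dp_noise_augment(adj, keep_low=10, sigma=0.1, rng=None):
    """Sketch of a DP-Noise-style augmentation: perturb only the
    high-frequency part of the Laplacian spectrum, then rebuild the graph.
    The cutoff `keep_low` and scale `sigma` are illustrative choices."""
    rng = np.random.default_rng() if rng is None else rng
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(deg, 1e-12)))
    lap = np.eye(len(adj)) - d_inv_sqrt @ adj @ d_inv_sqrt   # normalized Laplacian
    eigvals, eigvecs = np.linalg.eigh(lap)                   # ascending order: low freq first
    noisy = eigvals.copy()
    noisy[keep_low:] += sigma * rng.standard_normal(len(eigvals) - keep_low)
    lap_aug = eigvecs @ np.diag(noisy) @ eigvecs.T           # augmented Laplacian
    return np.eye(len(adj)) - lap_aug                        # back to a weighted adjacency
```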
arXiv Detail & Related papers (2024-01-18T12:58:53Z)
- Permutation Equivariant Graph Framelets for Heterophilous Graph Learning [6.679929638714752]
We develop a new way to implement multi-scale extraction via constructing Haar-type graph framelets.
We show that our model can achieve the best performance on certain datasets of heterophilous graphs.
arXiv Detail & Related papers (2023-06-07T09:05:56Z)
- Addressing Heterophily in Node Classification with Graph Echo State Networks [11.52174067809364]
We address the challenges of heterophilic graphs with Graph Echo State Network (GESN) for node classification.
GESN is a reservoir computing model for graphs, where node embeddings are computed by an untrained message-passing function.
Our experiments show that reservoir models achieve accuracy that is better than or comparable to most fully trained deep models.
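A compact NumPy sketch of the reservoir idea: node states are iterated with fixed random weights and only a linear readout is trained. The spectral-radius rescaling, the row-normalized adjacency, and the fixed iteration count are standard echo-state conventions assumed here, not necessarily GESN's exact formulation.

```python
import numpy as np

def gesn_embed(adj, feats, hidden=64, iters=30, rho=0.9, seed=0):
    """Untrained message passing with fixed random weights; only a
    downstream linear readout is trained on the returned embeddings."""
    rng = np.random.default_rng(seed)
    w_in = rng.uniform(-1, 1, (feats.shape[1], hidden))       # fixed input weights
    w_rec = rng.uniform(-1, 1, (hidden, hidden))
    w_rec *= rho / np.max(np.abs(np.linalg.eigvals(w_rec)))   # rescale for stability
    deg = np.maximum(adj.sum(axis=1, keepdims=True), 1.0)
    a_norm = adj / deg                                        # row-normalized adjacency
    x = np.zeros((adj.shape[0], hidden))
    for _ in range(iters):                                    # fixed-point style iteration
        x = np.tanh(feats @ w_in + a_norm @ x @ w_rec)
    return x

# The only trained component is the readout, e.g. ridge regression on labelled nodes:
# from sklearn.linear_model import RidgeClassifier
# clf = RidgeClassifier().fit(gesn_embed(adj, feats)[train_idx], y[train_idx])
```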
arXiv Detail & Related papers (2023-05-14T19:42:31Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
In addition, we develop an adaptive node-level pre-training method that dynamically masks nodes to distribute them evenly across the graph.
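A tiny sketch of the selection step, under the assumption that a domain-specific pairwise similarity matrix is already available; the function name and the top-k rule are illustrative, and the adaptive node-masking component is not shown.

```python
import numpy as np

def sample_positives(sim_matrix, anchor_idx, k=1):
    """Pick the k most similar training graphs as positives for an anchor.
    `sim_matrix` holds any domain-specific pairwise similarity
    (e.g. a graph kernel); its construction is left out here."""
    sims = sim_matrix[anchor_idx].astype(float).copy()
    sims[anchor_idx] = -np.inf                  # exclude the anchor itself
    return np.argsort(sims)[-k:][::-1]          # indices of the top-k positives
```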
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model that can handle any graph without being restricted by its underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent work has argued that GNNs primarily use the graph for feature smoothing and has shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- Robust Optimization as Data Augmentation for Large-scale Graphs [117.2376815614148]
We propose FLAG (Free Large-scale Adversarial Augmentation on Graphs), which iteratively augments node features with gradient-based adversarial perturbations during training.
FLAG is a general-purpose approach for graph data, which universally works in node classification, link prediction, and graph classification tasks.
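A rough PyTorch sketch of one such training step, assuming a model that takes (features, adj); the step size, the number of ascent steps, and the sign-gradient update are illustrative defaults, not the authors' reference implementation.

```python
import torch

def flag_step(model, x, adj, y, loss_fn, optimizer, step_size=1e-3, m=3):
    """One FLAG-style step (sketch): node features receive iterative
    gradient-based perturbations while parameter gradients accumulate."""
    optimizer.zero_grad()
    delta = torch.zeros_like(x).uniform_(-step_size, step_size).requires_grad_(True)
    for _ in range(m):
        loss = loss_fn(model(x + delta, adj), y) / m
        loss.backward()                                   # grads flow to delta and params
        # ascend on the perturbation, keep accumulating parameter gradients
        delta = (delta.detach() + step_size * delta.grad.sign()).requires_grad_(True)
    optimizer.step()                                      # update with accumulated grads
    return loss.item()
```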
arXiv Detail & Related papers (2020-10-19T21:51:47Z)
- Meta-path Free Semi-supervised Learning for Heterogeneous Networks [16.641434334366227]
Graph neural networks (GNNs) have been widely used in representation learning on graphs and achieved superior performance in tasks such as node classification.
In this paper, we propose simple and effective graph neural networks for heterogeneous graphs that do not use meta-paths.
arXiv Detail & Related papers (2020-10-18T06:01:58Z)
- Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low-pass characteristics of balanced graph cut, we propose a new GNN variant named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.