Simple Truncated SVD based Model for Node Classification on Heterophilic
Graphs
- URL: http://arxiv.org/abs/2106.12807v1
- Date: Thu, 24 Jun 2021 07:48:18 GMT
- Title: Simple Truncated SVD based Model for Node Classification on Heterophilic
Graphs
- Authors: Vijay Lingam, Rahul Ragesh, Arun Iyer, Sundararajan Sellamanickam
- Abstract summary: Graph Neural Networks (GNNs) have shown excellent performance on graphs that exhibit strong homophily.
Recent approaches have typically modified aggregation schemes, designed adaptive graph filters, etc. to address this limitation.
We propose a simple alternative method that exploits Truncated Singular Value Decomposition (TSVD) of topological structure and node features.
- Score: 0.5309004257911242
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) have shown excellent performance on graphs that
exhibit strong homophily with respect to the node labels, i.e., connected nodes
have the same labels. However, they perform poorly on heterophilic graphs. Recent
approaches have typically modified aggregation schemes, designed adaptive graph
filters, etc. to address this limitation. In spite of this, the performance on
heterophilic graphs can still be poor. We propose a simple alternative method
that exploits Truncated Singular Value Decomposition (TSVD) of topological
structure and node features. Our approach achieves up to ~30% improvement in
performance over state-of-the-art methods on heterophilic graphs. This work is
an early investigation into methods that differ from aggregation based
approaches. Our experimental results suggest that it might be important to
explore other alternatives to aggregation methods for heterophilic setting.
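The core idea of the abstract (low-rank embeddings of both the graph structure and the node features via truncated SVD, combined for downstream classification) can be illustrated with a minimal sketch. This is not the authors' exact model; the function name, the toy data, and the rank `k` are illustrative assumptions.

```python
import numpy as np
from scipy.sparse import csr_matrix
from scipy.sparse.linalg import svds

def tsvd_embed(M, k):
    # Rank-k truncated SVD embedding: rows of U scaled by singular values.
    # svds returns the k largest singular triplets (in ascending order,
    # which does not matter for an embedding).
    U, s, _ = svds(csr_matrix(M, dtype=np.float64), k=k)
    return U * s                          # shape: (n_nodes, k)

# Toy example: a random undirected graph with node features.
rng = np.random.default_rng(0)
n, d, k = 50, 16, 8
A = (rng.random((n, n)) < 0.1).astype(float)
A = np.maximum(A, A.T)                    # symmetrize the adjacency matrix
X = rng.standard_normal((n, d))           # node feature matrix

# Separate low-rank views of topology and features, then concatenate
# to form node representations for a downstream classifier.
Z = np.hstack([tsvd_embed(A, k), tsvd_embed(X, k)])
print(Z.shape)                            # (50, 16)
```

The resulting matrix `Z` could then be fed to any standard classifier (e.g. logistic regression) for node classification; notably, no neighborhood aggregation is involved, which is the point of contrast the abstract draws with GNN-style methods.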
Related papers
- Leveraging Invariant Principle for Heterophilic Graph Structure Distribution Shifts [42.77503881972965]
Heterophilic Graph Neural Networks (HGNNs) have shown promising results for semi-supervised learning tasks on graphs.
How to learn invariant node representations on heterophilic graphs to handle this structure difference or distribution shifts remains unexplored.
We propose HEI, a framework capable of generating invariant node representations by incorporating heterophily information.
arXiv Detail & Related papers (2024-08-18T14:10:34Z) - Generation is better than Modification: Combating High Class Homophily Variance in Graph Anomaly Detection [51.11833609431406]
In graph anomaly detection, homophily distribution differences between classes are significantly greater than those in homophilic and heterophilic graphs.
We introduce a new metric called Class Homophily Variance, which quantitatively describes this phenomenon.
To mitigate its impact, we propose a novel GNN model named Homophily Edge Generation Graph Neural Network (HedGe)
arXiv Detail & Related papers (2024-03-15T14:26:53Z) - GCNH: A Simple Method For Representation Learning On Heterophilous
Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z) - Graph Contrastive Learning under Heterophily via Graph Filters [51.46061703680498]
Graph contrastive learning (CL) methods learn node representations in a self-supervised manner by maximizing the similarity between the augmented node representations obtained via a GNN-based encoder.
In this work, we propose an effective graph CL method, namely HLCL, for learning graph representations under heterophily.
Our extensive experiments show that HLCL outperforms state-of-the-art graph CL methods on benchmark datasets with heterophily, as well as large-scale real-world graphs, by up to 7%, and outperforms graph supervised learning methods on datasets with heterophily by up to 10%.
arXiv Detail & Related papers (2023-03-11T08:32:39Z) - Similarity-aware Positive Instance Sampling for Graph Contrastive
Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with
Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like"), and fail to generalize to heterophilous graphs where dissimilar nodes connect.
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and new architecture improvements on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than such carefully designed architectures on some heterophilous graphs.
arXiv Detail & Related papers (2021-06-11T02:44:00Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model that is capable of handling any graph without being restricted by its underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z) - Topology-aware Tensor Decomposition for Meta-graph Learning [33.70569156426479]
A common approach for extracting useful information from heterogeneous graphs is to use meta-graphs.
We propose a new viewpoint from tensor on learning meta-graphs.
We also propose a topology-aware tensor decomposition, called TENSUS, that reflects the structure of DAGs.
arXiv Detail & Related papers (2021-01-04T16:38:00Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.