Simplifying approach to Node Classification in Graph Neural Networks
- URL: http://arxiv.org/abs/2111.06748v1
- Date: Fri, 12 Nov 2021 14:53:22 GMT
- Title: Simplifying approach to Node Classification in Graph Neural Networks
- Authors: Sunil Kumar Maurya, Xin Liu and Tsuyoshi Murata
- Abstract summary: We decouple the node feature aggregation step from the depth of the graph neural network, and empirically analyze how different aggregated features contribute to prediction performance.
We show that not all features generated via aggregation steps are useful, and often using these less informative features can be detrimental to the performance of the GNN model.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN), and show empirically that the proposed model achieves comparable or even higher accuracy than state-of-the-art GNN models.
- Score: 7.057970273958933
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks have become one of the indispensable tools for learning
from graph-structured data, and their usefulness has been shown in a wide
variety of tasks. In recent years, there have been tremendous improvements in
architecture design, resulting in better performance on various prediction
tasks. In general, these neural architectures combine node feature aggregation
and feature transformation using a learnable weight matrix in the same layer.
This makes it challenging to analyze the importance of node features aggregated
from various hops and the expressiveness of the neural network layers. As
different graph datasets show varying levels of homophily and heterophily in
features and class label distribution, it becomes essential to understand which
features are important for the prediction tasks without any prior information.
In this work, we decouple the node feature aggregation step from the depth of
the graph neural network, and empirically analyze how different aggregated features play
a role in prediction performance. We show that not all features generated via
aggregation steps are useful, and often using these less informative features
can be detrimental to the performance of the GNN model. Through our
experiments, we show that learning certain subsets of these features can lead
to better performance on a wide variety of datasets. We propose to use softmax as
a regularizer and "soft-selector" of features aggregated from neighbors at
different hop distances; and L2-Normalization over GNN layers. Combining these
techniques, we present a simple and shallow model, Feature Selection Graph
Neural Network (FSGNN), and show empirically that the proposed model achieves
comparable or even higher accuracy than state-of-the-art GNN models on nine
benchmark datasets for the node classification task, with remarkable
improvements up to 51.1%.
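To make the pipeline concrete, below is a minimal PyTorch sketch of the ideas described in the abstract, not the authors' reference implementation: hop-wise aggregated features are precomputed so the model itself stays shallow, a softmax over learnable scalars acts as a regularizing "soft-selector" across hops, and each hop's transformed features are L2-normalized. All module and variable names (SoftSelectGNN, precompute_hop_features, hop_logits) are illustrative, not from the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


def precompute_hop_features(x, adj_norm, num_hops):
    """Decouple aggregation from depth: aggregate node features for each
    hop up front, so the model itself can stay shallow."""
    feats = [x]
    for _ in range(num_hops):
        x = adj_norm @ x              # one more hop of neighborhood aggregation
        feats.append(x)
    return feats                      # list of (N, F) tensors for hops 0..num_hops


class SoftSelectGNN(nn.Module):
    """Shallow model: one linear map per hop, softmax-weighted soft selection."""

    def __init__(self, in_dim, hid_dim, num_classes, num_hops):
        super().__init__()
        self.hop_linears = nn.ModuleList(
            [nn.Linear(in_dim, hid_dim) for _ in range(num_hops + 1)]
        )
        # One learnable scalar per hop; softmax turns them into soft selection
        # weights and regularizes the combination (weights sum to 1).
        self.hop_logits = nn.Parameter(torch.zeros(num_hops + 1))
        self.classifier = nn.Linear(hid_dim * (num_hops + 1), num_classes)

    def forward(self, hop_feats):
        weights = torch.softmax(self.hop_logits, dim=0)
        parts = []
        for w, lin, h in zip(weights, self.hop_linears, hop_feats):
            z = F.normalize(lin(h), p=2, dim=1)  # L2-normalize each node's features
            parts.append(w * z)                  # soft-select this hop's contribution
        return self.classifier(torch.cat(parts, dim=1))


# Toy usage on a random graph (purely illustrative).
num_nodes, in_dim = 8, 16
x = torch.randn(num_nodes, in_dim)
# Stand-in for a normalized adjacency matrix (rows sum to 1).
adj_norm = torch.rand(num_nodes, num_nodes)
adj_norm = adj_norm / adj_norm.sum(dim=1, keepdim=True)
model = SoftSelectGNN(in_dim, hid_dim=32, num_classes=3, num_hops=2)
logits = model(precompute_hop_features(x, adj_norm, num_hops=2))
print(logits.shape)  # torch.Size([8, 3])
```

Because aggregation happens once up front, adding hops does not deepen the network; only the number of per-hop linear maps grows.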
Related papers
- TANGNN: a Concise, Scalable and Effective Graph Neural Networks with Top-m Attention Mechanism for Graph Representation Learning [7.879217146851148]
We propose an innovative Graph Neural Network (GNN) architecture that integrates a Top-m attention mechanism aggregation component and a neighborhood aggregation component.
To assess the effectiveness of our proposed model, we have applied it to citation sentiment prediction, a novel task previously unexplored in the GNN field.
arXiv Detail & Related papers (2024-11-23T05:31:25Z)
- DA-MoE: Addressing Depth-Sensitivity in Graph-Level Analysis through Mixture of Experts [70.21017141742763]
Graph neural networks (GNNs) are gaining popularity for processing graph-structured data.
Existing methods generally use a fixed number of GNN layers to generate representations for all graphs.
We propose the depth adaptive mixture of experts (DA-MoE) method, which incorporates two main improvements to GNNs.
arXiv Detail & Related papers (2024-11-05T11:46:27Z)
- Noise-Resilient Unsupervised Graph Representation Learning via Multi-Hop Feature Quality Estimation [53.91958614666386]
This work addresses unsupervised graph representation learning (UGRL) based on graph neural networks (GNNs).
We propose a novel UGRL method based on Multi-hop feature Quality Estimation (MQE)
arXiv Detail & Related papers (2024-07-29T12:24:28Z)
- Hyperbolic Benchmarking Unveils Network Topology-Feature Relationship in GNN Performance [0.5416466085090772]
We introduce a comprehensive benchmarking framework for graph machine learning.
We generate synthetic networks with realistic topological properties and node feature vectors.
Results highlight the dependency of model performance on the interplay between network structure and node features.
arXiv Detail & Related papers (2024-06-04T20:40:06Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree-based group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Ensemble Learning for Graph Neural Networks [28.3650473174488]
Graph Neural Networks (GNNs) have shown success in various fields for learning from graph-structured data.
This paper investigates the application of ensemble learning techniques to improve the performance and robustness of GNNs.
arXiv Detail & Related papers (2023-10-22T03:55:13Z)
- A Robust Stacking Framework for Training Deep Graph Models with Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
The best models for IID (non-graph) data in most standard supervised learning settings are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z)
- Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns [19.346133577539394]
Graph neural networks (GNNs) have achieved tremendous success on multiple graph-based learning tasks.
We focus on transforming the input graph into a computation graph which contains both proximity and structural information.
We show that adaptively choosing between structure and proximity leads to improved performance under diverse mixing patterns.
arXiv Detail & Related papers (2021-06-11T19:18:34Z)
- Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN)
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to 64% improvement in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from this unified framework (UGNN), to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Policy-GNN: Aggregation Optimization for Graph Neural Networks [60.50932472042379]
Graph neural networks (GNNs) aim to model the local graph structures and capture the hierarchical patterns by aggregating the information from neighbors.
It is a challenging task to develop an effective aggregation strategy for each node, given complex graphs and sparse features.
We propose Policy-GNN, a meta-policy framework that combines the sampling procedure and message passing of GNNs into a single learning process.
arXiv Detail & Related papers (2020-06-26T17:03:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.