Graph Feature Gating Networks
- URL: http://arxiv.org/abs/2105.04493v1
- Date: Mon, 10 May 2021 16:33:58 GMT
- Title: Graph Feature Gating Networks
- Authors: Wei Jin, Xiaorui Liu, Yao Ma, Tyler Derr, Charu Aggarwal, Jiliang Tang
- Abstract summary: We propose a general graph feature gating network (GFGN) based on the graph signal denoising problem, and introduce three graph filters under GFGN to allow different levels of contributions from feature dimensions.
- Score: 31.20878472589719
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) have received tremendous attention due to their
power in learning effective representations for graphs. Most GNNs follow a
message-passing scheme where the node representations are updated by
aggregating and transforming the information from the neighborhood. At the same
time, they apply the same aggregation strategy to every feature dimension.
However, as suggested by social dimension theory and spectral embedding, there
are potential benefits to treating the dimensions differently during the
aggregation process. In this work, we investigate how to enable heterogeneous
contributions of feature dimensions in GNNs. In particular, we
propose a general graph feature gating network (GFGN) based on the graph signal
denoising problem and then correspondingly introduce three graph filters under
GFGN to allow different levels of contributions from feature dimensions.
Extensive experiments on various real-world datasets demonstrate the
effectiveness and robustness of the proposed frameworks.
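To make the gating idea concrete, here is a minimal NumPy sketch of one message-passing step with per-dimension gates. The sigmoid gating network (`Wg`, `bg`), the convex combination, and the symmetric normalization are illustrative assumptions; they are not the paper's exact GFGN filters, which are derived from the graph signal denoising problem (see the objective quoted under the graph signal denoising paper in the related list below).

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetrically normalized adjacency D^{-1/2} (A + I) D^{-1/2} with self-loops."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def gated_aggregation(A, X, Wg, bg):
    """One message-passing step with heterogeneous feature dimensions.

    A standard GNN layer smooths every feature dimension identically;
    here each node gets a gate vector in (0, 1)^d (from a hypothetical
    one-layer gating network Wg, bg) that decides, per dimension, how
    much neighborhood information to mix in.
    """
    S = normalized_adjacency(A)
    smoothed = S @ X                               # uniform aggregation, shape (n, d)
    gates = 1.0 / (1.0 + np.exp(-(X @ Wg + bg)))   # sigmoid gates, shape (n, d)
    # Per-dimension convex combination of self features and aggregated signal.
    return gates * X + (1.0 - gates) * smoothed

rng = np.random.default_rng(0)
A = (rng.random((5, 5)) < 0.4).astype(float)
A = np.triu(A, 1); A = A + A.T                     # symmetric, no self-loops
X = rng.standard_normal((5, 8))
Wg = 0.1 * rng.standard_normal((8, 8))
H = gated_aggregation(A, X, Wg, np.zeros(8))
print(H.shape)                                     # (5, 8)
```

A gate near 1 keeps a dimension close to the node's own signal, while a gate near 0 smooths it over the neighborhood, which is one way to realize "different levels of contributions from feature dimensions".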
Related papers
- A Manifold Perspective on the Statistical Generalization of Graph Neural Networks [84.01980526069075]
Graph Neural Networks (GNNs) combine information from adjacent nodes by successive applications of graph convolutions.
We study the generalization gaps of GNNs on both node-level and graph-level tasks.
We show that the generalization gaps decrease with the number of nodes in the training graphs.
arXiv Detail & Related papers (2024-06-07T19:25:02Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- DEGREE: Decomposition Based Explanation For Graph Neural Networks [55.38873296761104]
We propose DEGREE to provide a faithful explanation for GNN predictions.
By decomposing the information generation and aggregation mechanism of GNNs, DEGREE allows tracking the contributions of specific components of the input graph to the final prediction.
We also design a subgraph-level interpretation algorithm to reveal complex interactions between graph nodes that are overlooked by previous methods.
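As a toy illustration of why decomposition-based attribution is possible at all: for a purely linear aggregation step, the output separates exactly into contributions from disjoint node groups. The sketch below shows only this linearity argument; DEGREE's actual decomposition rules for nonlinear GNN layers are more involved.

```python
import numpy as np

# For a linear step H = S @ X @ W, contributions from disjoint node
# groups add up exactly, so a group's share of any prediction can be
# tracked by zeroing out the other group's features.
rng = np.random.default_rng(1)
n, d = 6, 4
S = rng.random((n, n)); S /= S.sum(axis=1, keepdims=True)   # row-normalized aggregation
X = rng.standard_normal((n, d))
W = rng.standard_normal((d, d))

mask = np.zeros((n, 1)); mask[:3] = 1.0      # group A: nodes 0-2, group B: nodes 3-5
H_full = S @ X @ W
H_a = S @ (X * mask) @ W                     # contribution of group A
H_b = S @ (X * (1 - mask)) @ W               # contribution of group B
assert np.allclose(H_full, H_a + H_b)        # exact decomposition
```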
arXiv Detail & Related papers (2023-05-22T10:29:52Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with the ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
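The "one parameter per relation" mechanism can be sketched as scaling every message by a learnable scalar indexed by its edge type. The layer below is our simplified reading; the exact placement of the scalars and the normalization in RE-GNN may differ.

```python
import numpy as np

def relation_weighted_layer(edges, etypes, X, W, rel_weight, self_weight):
    """Aggregation where each message is scaled by one scalar per relation.

    edges: (m, 2) array of (src, dst) node indices
    etypes: (m,) relation id per edge
    rel_weight: (num_relations,) one learnable scalar per relation type
    self_weight: learnable scalar for the self-loop connection
    """
    out = self_weight * X                          # weighted self-loop term
    msgs = X[edges[:, 0]] * rel_weight[etypes][:, None]
    np.add.at(out, edges[:, 1], msgs)              # sum scaled messages per destination
    return out @ W                                 # shared (homogeneous) transformation

X = np.random.default_rng(2).standard_normal((4, 3))
edges = np.array([[0, 1], [2, 1], [3, 0]])
etypes = np.array([0, 1, 0])                       # two relation types
H = relation_weighted_layer(edges, etypes, X, np.eye(3),
                            rel_weight=np.array([0.5, 2.0]), self_weight=1.0)
print(H.shape)                                     # (4, 3)
```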
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Towards Better Generalization with Flexible Representation of Multi-Module Graph Neural Networks [0.27195102129094995]
We use a random graph generator to investigate how the graph size and structural properties affect the predictive performance of GNNs.
We present specific evidence that the average node degree is a key feature in determining whether GNNs can generalize to unseen graphs.
We propose a multi-module GNN framework that allows the network to adapt flexibly to new graphs by generalizing a single canonical nonlinear transformation over aggregated inputs.
arXiv Detail & Related papers (2022-09-14T12:13:59Z)
- Meta-Weight Graph Neural Network: Push the Limits Beyond Global Homophily [24.408557217909316]
Graph Neural Networks (GNNs) show strong expressive power on graph data mining.
However, not all graphs are homophilic, and even within the same graph the local distributions may vary significantly.
We propose Meta Weight Graph Neural Network (MWGNN) to adaptively construct graph convolution layers for different nodes.
arXiv Detail & Related papers (2022-03-19T09:27:38Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters in leveraging structural information from the graph, while our experiments demonstrate the advantages of our method on various learning tasks.
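For context, geometric scattering typically builds its band-pass filters from diffusion wavelets over a lazy random walk; a common formulation (our summary, not necessarily this paper's exact construction) is

```latex
P = \tfrac{1}{2}\bigl(I + A D^{-1}\bigr), \qquad
\Psi_0 = I - P, \qquad
\Psi_j = P^{2^{j-1}} - P^{2^{j}}, \quad j \ge 1,
```

where each wavelet \Psi_j isolates a band of diffusion scales, complementing the low-pass averaging behavior of standard GCN filters.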
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
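The graph signal denoising problem referenced here, which also underlies GFGN above, is usually posed as

```latex
\min_{F} \; \|F - X\|_F^2 + c \, \operatorname{tr}\bigl(F^{\top} L F\bigr),
```

where X is the noisy input signal, L is a graph Laplacian, and c > 0 trades off fidelity against smoothness. One gradient-descent step from F = X with step size 1/(2c) gives F = (I - L)X, i.e., a GCN-style aggregation with the normalized adjacency; this is the sense in which aggregation "solves" a denoising problem.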
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
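The filters in question are polynomials in a graph shift operator S (e.g., an adjacency or Laplacian matrix), and permutation equivariance is the standard identity, stated here in our notation:

```latex
H(S)\,x = \sum_{k=0}^{K} h_k S^{k} x, \qquad
H\bigl(P^{\top} S P\bigr)\, P^{\top} x = P^{\top} H(S)\, x
```

for any permutation matrix P: relabeling the nodes simply permutes the filter output the same way, independent of the particular graph.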
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
- Geom-GCN: Geometric Graph Convolutional Networks [15.783571061254847]
We propose a novel geometric aggregation scheme for graph neural networks to overcome two weaknesses of message-passing GNNs: the loss of structural information of nodes in neighborhoods and the inability to capture long-range dependencies in disassortative graphs.
The proposed aggregation scheme is permutation-invariant and consists of three modules: node embedding, structural neighborhood, and bi-level aggregation.
We also present an implementation of the scheme in graph convolutional networks, termed Geom-GCN, to perform transductive learning on graphs.
arXiv Detail & Related papers (2020-02-13T00:03:09Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.