Meta-Weight Graph Neural Network: Push the Limits Beyond Global Homophily
- URL: http://arxiv.org/abs/2203.10280v1
- Date: Sat, 19 Mar 2022 09:27:38 GMT
- Title: Meta-Weight Graph Neural Network: Push the Limits Beyond Global Homophily
- Authors: Xiaojun Ma, Qin Chen, Yuanyi Ren, Guojie Song, Liang Wang
- Abstract summary: Graph Neural Networks (GNNs) show strong expressive power in graph data mining.
However, not all graphs are homophilic; even within the same graph, the distributions may vary significantly.
We propose the Meta-Weight Graph Neural Network (MWGNN) to adaptively construct graph convolution layers for different nodes.
- Score: 24.408557217909316
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph Neural Networks (GNNs) show strong expressive power in graph data
mining by aggregating information from neighbors and using the integrated
representation in downstream tasks. Most GNNs apply the same aggregation method
and parameters to every node in a graph, which lets them exploit homophilic
relational data. However, not all graphs are homophilic; even within the same
graph, the distributions may vary significantly. Using the same convolution
over all nodes can therefore overlook diverse graph patterns. Furthermore, many
existing GNNs integrate node features and structure in an identical way, which
ignores the distributions of nodes and further limits their expressive power.
To solve these problems, we propose the Meta-Weight Graph Neural Network
(MWGNN), which adaptively constructs graph convolution layers for different
nodes. First, we model the Node Local Distribution (NLD) from the node feature,
topological structure, and positional identity aspects with the Meta-Weight.
Then, based on the Meta-Weight, we generate adaptive graph convolutions that
perform node-specific weighted aggregation and boost the node representations.
Finally, we design extensive experiments on real-world and synthetic benchmarks
to evaluate the effectiveness of MWGNN. These experiments show the excellent
expressive power of MWGNN in dealing with graph data with various
distributions.
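The core mechanism can be illustrated with a minimal sketch: a small meta-network maps a per-node descriptor of its local distribution to node-specific aggregation weights. This is a simplification under stated assumptions (the module names, the crude descriptor, and the single gating weight are illustrative, not the paper's exact design):

```python
import torch
import torch.nn as nn

class MetaWeightConv(nn.Module):
    """Sketch of a node-adaptive graph convolution.

    A meta-network maps each node's local-distribution descriptor (here,
    raw features plus degree as a crude stand-in for the feature/topology/
    position information in the paper's NLD) to a per-node gate that mixes
    self and neighbor information. All names are illustrative.
    """
    def __init__(self, in_dim, out_dim, hidden=16):
        super().__init__()
        self.lin_self = nn.Linear(in_dim, out_dim)
        self.lin_neigh = nn.Linear(in_dim, out_dim)
        # Meta-network: descriptor (features + degree) -> node-specific weight
        self.meta = nn.Sequential(nn.Linear(in_dim + 1, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, x, adj):
        # adj: dense [N, N] adjacency; mean-aggregate neighbor features
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        neigh = adj @ x / deg
        alpha = self.meta(torch.cat([x, deg], dim=1))  # [N, 1] per-node gate
        # Node-specific weighted aggregation: nodes in homophilic regions can
        # lean on neighbors (alpha -> 1), heterophilic ones on themselves.
        return alpha * self.lin_neigh(neigh) + (1 - alpha) * self.lin_self(x)
```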
Related papers
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
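A minimal sketch of the idea, assuming nodes are bucketed by degree (the bucket boundaries and all names are illustrative, not taken from the paper):

```python
import torch
import torch.nn as nn

class DegreeStratifiedConv(nn.Module):
    """Sketch: one weight matrix per degree bucket."""
    def __init__(self, in_dim, out_dim, boundaries=(2, 10)):
        super().__init__()
        self.boundaries = torch.tensor(boundaries)
        # One linear map per group: low-, medium-, and high-degree nodes
        self.lins = nn.ModuleList(nn.Linear(in_dim, out_dim)
                                  for _ in range(len(boundaries) + 1))

    def forward(self, x, adj):
        deg = adj.sum(dim=1)
        neigh = adj @ x / deg.clamp(min=1).unsqueeze(1)
        group = torch.bucketize(deg, self.boundaries)  # [N] group index
        out = torch.zeros(x.size(0), self.lins[0].out_features)
        for g, lin in enumerate(self.lins):
            mask = group == g
            out[mask] = lin(neigh[mask])  # group-specific weights
        return out
```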
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
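A rough sketch of learnable directionality, modeled here as a learned mixture of the adjacency matrix and its transpose (a simplification of the paper's approach; all names are illustrative):

```python
import torch
import torch.nn as nn

class DirectionAdaptiveConv(nn.Module):
    """Sketch: learn how much to treat edges as directed vs. undirected."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)
        self.logit = nn.Parameter(torch.zeros(1))  # learned mixing parameter

    def forward(self, x, adj):
        beta = torch.sigmoid(self.logit)
        # beta -> 0: keep edge direction as-is; beta -> 1: reverse it;
        # beta = 0.5 is the symmetrized (undirected) graph up to scale.
        mixed = (1 - beta) * adj + beta * adj.t()
        deg = mixed.sum(dim=1, keepdim=True).clamp(min=1e-6)
        return self.lin(mixed @ x / deg)
```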
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- Node-wise Localization of Graph Neural Networks [52.04194209002702]
Graph neural networks (GNNs) have emerged as a powerful family of representation learning models on graphs.
We propose a node-wise localization of GNNs by accounting for both global and local aspects of the graph.
We conduct extensive experiments on four benchmark graphs and consistently obtain promising performance, surpassing state-of-the-art GNNs.
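One way to read node-wise localization, sketched as a node-conditioned scale-and-shift of a globally shared transformation (the gating design here is an assumption, not the paper's exact architecture):

```python
import torch
import torch.nn as nn

class NodeLocalizedConv(nn.Module):
    """Sketch: a global transform modulated per node by scale and shift."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.global_lin = nn.Linear(in_dim, out_dim)  # shared by all nodes
        self.scale = nn.Linear(in_dim, out_dim)       # local modulation
        self.shift = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
        h = self.global_lin(adj @ x / deg)
        # Localize: each node rescales and shifts the global transform's output
        return torch.sigmoid(self.scale(x)) * h + self.shift(x)
```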
arXiv Detail & Related papers (2021-10-27T10:02:03Z)
- Explicit Pairwise Factorized Graph Neural Network for Semi-Supervised Node Classification [59.06717774425588]
We propose the Explicit Pairwise Factorized Graph Neural Network (EPFGNN), which models the whole graph as a partially observed Markov Random Field.
It contains explicit pairwise factors to model output-output relations and uses a GNN backbone to model input-output relations.
We conduct experiments on various datasets, which show that our model can effectively improve the performance of semi-supervised node classification on graphs.
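To make the factorization concrete, a hypothetical energy function combining unary scores from a GNN backbone with explicit pairwise output-output factors over edges (the tensor shapes and names are assumptions for illustration):

```python
import torch

def pairwise_mrf_energy(unary_logits, pair_logits, edges, labels):
    """Sketch: energy of a labeling under unary (GNN) + pairwise factors.

    unary_logits: [N, C] per-node class scores from a GNN backbone
    pair_logits:  [E, C, C] per-edge label-compatibility scores
    edges:        [E, 2] (i, j) endpoint indices
    labels:       [N]    candidate labeling
    Lower energy means a more compatible labeling.
    """
    n, e = labels.size(0), edges.size(0)
    unary = -unary_logits[torch.arange(n), labels].sum()
    li, lj = labels[edges[:, 0]], labels[edges[:, 1]]
    pairwise = -pair_logits[torch.arange(e), li, lj].sum()
    return unary + pairwise
```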
arXiv Detail & Related papers (2021-07-27T19:47:53Z)
- Breaking the Limit of Graph Neural Networks by Improving the Assortativity of Graphs with Local Mixing Patterns [19.346133577539394]
Graph neural networks (GNNs) have achieved tremendous success on multiple graph-based learning tasks.
We focus on transforming the input graph into a computation graph which contains both proximity and structural information.
We show that adaptively choosing between structure and proximity leads to improved performance under diverse mixing patterns.
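As a rough illustration of the structural channel, a sketch that links nodes with similar local degree profiles into a second adjacency matrix (this degree-fingerprint heuristic is an assumption, not the paper's construction):

```python
import numpy as np

def structural_knn_graph(adj, k=3):
    """Sketch: build a 'structure' channel by connecting each node to the k
    nodes with the most similar local degree profile (own degree plus the
    two largest neighbor degrees)."""
    deg = adj.sum(axis=1)
    n = adj.shape[0]
    fingerprint = np.zeros((n, 3))
    for i in range(n):
        nbr_deg = np.sort(deg[adj[i] > 0])[::-1]
        fingerprint[i, 0] = deg[i]
        fingerprint[i, 1:1 + min(2, nbr_deg.size)] = nbr_deg[:2]
    dists = np.linalg.norm(fingerprint[:, None] - fingerprint[None, :], axis=2)
    np.fill_diagonal(dists, np.inf)  # exclude self-matches
    struct_adj = np.zeros_like(adj, dtype=float)
    for i in range(n):
        struct_adj[i, np.argsort(dists[i])[:k]] = 1.0  # k most similar nodes
    return struct_adj
```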
arXiv Detail & Related papers (2021-06-11T19:18:34Z)
- Towards Expressive Graph Representation [16.17079730998607]
A Graph Neural Network (GNN) aggregates the neighborhood of each node into the node embedding.
We present a theoretical framework to design a continuous injective set function for neighborhood aggregation in GNN.
We validate the proposed expressive GNN for graph classification on multiple benchmark datasets.
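A standard concrete instance of an injective-style aggregator is the GIN update, where sum aggregation (injective on multisets over countable feature spaces) is followed by an MLP; shown here for reference rather than as this paper's exact construction:

```python
import torch
import torch.nn as nn

class InjectiveAggConv(nn.Module):
    """GIN-style layer: sum aggregation + MLP approximates an injective
    multiset function on neighborhoods."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))  # weights the node itself
        self.mlp = nn.Sequential(nn.Linear(in_dim, out_dim), nn.ReLU(),
                                 nn.Linear(out_dim, out_dim))

    def forward(self, x, adj):
        # Sum (not mean/max) aggregation preserves multiset information,
        # so distinct neighborhoods map to distinct representations.
        return self.mlp((1 + self.eps) * x + adj @ x)
```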
arXiv Detail & Related papers (2020-10-12T03:13:41Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
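The denoising view admits a compact worked example: gradient descent on min_F ||F − X||² + c·tr(Fᵀ L F), with graph Laplacian L = D − A, yields GNN-like propagation, since each gradient step mixes the noisy signal with neighbor-smoothed values. A minimal sketch (a single global smoothness c here; ADA-UGNN's point is to let smoothness adapt across nodes):

```python
import torch

def denoise(x, adj, c=1.0, steps=10, lr=0.1):
    """Gradient descent on ||F - X||_F^2 + c * tr(F^T L F), L = D - A."""
    lap = torch.diag(adj.sum(dim=1)) - adj  # combinatorial graph Laplacian
    f = x.clone()
    for _ in range(steps):
        # Gradient: 2(F - X) + 2c * L @ F -- a mix of the noisy signal and
        # a neighbor-smoothing term, i.e., a GNN-style aggregation step.
        f = f - lr * (2 * (f - x) + 2 * c * (lap @ f))
    return f
```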
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Multi-grained Semantics-aware Graph Neural Networks [13.720544777078642]
Graph Neural Networks (GNNs) are powerful techniques in representation learning for graphs.
This work proposes a unified model, AdamGNN, to interactively learn node and graph representations.
Experiments on 14 real-world graph datasets show that AdamGNN can significantly outperform 17 competing models on both node- and graph-wise tasks.
arXiv Detail & Related papers (2020-10-01T07:52:06Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Lehman tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE), a new class of structure-related features for graph representation learning.
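A minimal sketch of the distance-encoding idea, using truncated multi-source BFS shortest-path distances to a target node set as the extra feature (the paper also covers other distance measures, e.g. random-walk-based ones; this helper is illustrative):

```python
import torch
from collections import deque

def distance_encoding(adj_list, targets, num_nodes, max_dist=3):
    """One-hot encode each node's shortest-path distance to a target set.

    adj_list: {node: [neighbors]}; targets: nodes defining the substructure
    of interest. Distances beyond max_dist share one "far" bucket, keeping
    the encoding compact. Concatenate the result to the node features.
    """
    far = max_dist + 1
    dist = [far] * num_nodes
    queue = deque((t, 0) for t in targets)
    for t in targets:
        dist[t] = 0
    while queue:  # multi-source BFS from the target set
        u, d = queue.popleft()
        if d >= max_dist:
            continue
        for v in adj_list.get(u, []):
            if dist[v] > d + 1:
                dist[v] = d + 1
                queue.append((v, d + 1))
    return torch.nn.functional.one_hot(torch.tensor(dist), far + 1).float()
```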
arXiv Detail & Related papers (2020-08-31T23:15:40Z)