Motif Graph Neural Network
- URL: http://arxiv.org/abs/2112.14900v3
- Date: Mon, 3 Jul 2023 09:27:25 GMT
- Title: Motif Graph Neural Network
- Authors: Xuexin Chen, Ruichu Cai, Yuan Fang, Min Wu, Zijian Li, Zhifeng Hao
- Abstract summary: Graph neural networks (GNNs) are the most popular model in graph embedding approaches.
Standard GNNs in the neighborhood aggregation paradigm suffer from limited discriminative power in distinguishing \emph{high-order} graph structures.
We propose Motif Graph Neural Network (MGNN), a novel framework to better capture high-order structures.
- Score: 32.659902358658755
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graphs can model complicated interactions between entities, which naturally
emerge in many important applications. These applications can often be cast
into standard graph learning tasks, in which a crucial step is to learn
low-dimensional graph representations. Graph neural networks (GNNs) are
currently the most popular model in graph embedding approaches. However,
standard GNNs in the neighborhood aggregation paradigm suffer from limited
discriminative power in distinguishing \emph{high-order} graph structures as
opposed to \emph{low-order} structures. To capture high-order structures,
researchers have resorted to motifs and developed motif-based GNNs. However,
existing motif-based GNNs still often suffer from less discriminative power on
high-order structures. To overcome the above limitations, we propose Motif
Graph Neural Network (MGNN), a novel framework to better capture high-order
structures, hinging on our proposed motif redundancy minimization operator and
injective motif combination. First, MGNN produces a set of node representations
w.r.t. each motif. The next phase is our proposed redundancy minimization among
motifs which compares the motifs with each other and distills the features
unique to each motif. Finally, MGNN performs the updating of node
representations by combining multiple representations from different motifs. In
particular, to enhance the discriminative power, MGNN utilizes an injective
function to combine the representations w.r.t. different motifs. We further
show that our proposed architecture increases the expressive power of GNNs with
a theoretical analysis. We demonstrate that MGNN outperforms state-of-the-art
methods on seven public benchmarks on both node classification and graph
classification tasks.
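The three-phase pipeline in the abstract (per-motif representations, redundancy minimization, injective combination) can be sketched in a few lines of NumPy. This is an illustrative simplification, not the paper's method: the fabricated motif adjacencies, the mean-subtraction stand-in for the redundancy minimization operator, and the choice of concatenation as the injective combiner are all assumptions made here for clarity.

```python
# Hypothetical sketch of the MGNN pipeline described in the abstract.
# Shapes, motif adjacencies, and operators are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

n_nodes, dim = 5, 4
X = rng.normal(size=(n_nodes, dim))          # input node features

# Phase 1: per-motif propagation. Each motif m induces its own motif
# adjacency A_m; here we fabricate two binary ones for illustration.
motif_adjs = [rng.integers(0, 2, size=(n_nodes, n_nodes)) for _ in range(2)]
H = [A @ X for A in motif_adjs]              # node representations w.r.t. each motif

# Phase 2: redundancy minimization. Remove from each motif's representation
# the component shared across motifs (here: the mean), keeping features
# unique to that motif -- a crude stand-in for the paper's operator.
mean_H = np.mean(H, axis=0)
H_unique = [h - mean_H for h in H]

# Phase 3: injective combination. Concatenating the per-motif
# representations is injective: distinct inputs map to distinct outputs.
Z = np.concatenate(H_unique, axis=1)         # final node representations

print(Z.shape)                               # (n_nodes, dim * n_motifs) = (5, 8)
```

Concatenation is the simplest injective combiner; the paper's injective function is what gives MGNN its claimed boost in discriminative power over motif-based GNNs that merge motif representations non-injectively (e.g. by summing or averaging).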
Related papers
- Logical Expressiveness of Graph Neural Networks with Hierarchical Node Individualization [0.6215404942415159]
Hierarchical Ego Graph Neural Networks (HEGNNs) are an expressive extension of graph neural networks (GNNs).
HEGNNs generalize subgraph-GNNs and form a hierarchy of increasingly expressive models that can distinguish graphs up to isomorphism.
Our experimental results confirm the practical feasibility of HEGNNs and show benefits in comparison with traditional GNN architectures.
arXiv Detail & Related papers (2025-06-16T18:39:31Z) - Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs)
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z) - Uplifting the Expressive Power of Graph Neural Networks through Graph Partitioning [3.236774847052122]
We study the expressive power of graph neural networks through the lens of graph partitioning.
We introduce a novel GNN architecture, namely Graph Partitioning Neural Networks (GPNNs)
arXiv Detail & Related papers (2023-12-14T06:08:35Z) - Gradient Gating for Deep Multi-Rate Learning on Graphs [62.25886489571097]
We present Gradient Gating (G$^2$), a novel framework for improving the performance of Graph Neural Networks (GNNs)
Our framework is based on gating the output of GNN layers with a mechanism for multi-rate flow of message passing information across nodes of the underlying graph.
arXiv Detail & Related papers (2022-10-02T13:19:48Z) - High-Order Pooling for Graph Neural Networks with Tensor Decomposition [23.244580796300166]
Graph Neural Networks (GNNs) are attracting growing attention due to their effectiveness and flexibility in modeling a variety of graph-structured data.
We propose the Tensorized Graph Neural Network (tGNN), a highly expressive GNN architecture relying on tensor decomposition to model high-order non-linear node interactions.
arXiv Detail & Related papers (2022-05-24T01:12:54Z) - Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks)
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z) - From Stars to Subgraphs: Uplifting Any GNN with Local Structure Awareness [23.279464786779787]
We introduce a general framework to uplift any MPNN to be more expressive.
Our framework is strictly more powerful than 1&2-WL, and is not less powerful than 3-WL.
Our method sets new state-of-the-art performance by large margins for several well-known graph ML tasks.
arXiv Detail & Related papers (2021-10-07T19:08:08Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance against the state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study tradeoffs of computational cost and expressive power of Graph Neural Networks (GNNs)
We show that a new model can count subgraphs of size $k$, and thereby overcomes a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z) - Towards Expressive Graph Representation [16.17079730998607]
Graph Neural Network (GNN) aggregates the neighborhood of each node into the node embedding.
We present a theoretical framework to design a continuous injective set function for neighborhood aggregation in GNN.
We validate the proposed expressive GNN for graph classification on multiple benchmark datasets.
arXiv Detail & Related papers (2020-10-12T03:13:41Z) - The Surprising Power of Graph Neural Networks with Random Node Initialization [54.4101931234922]
Graph neural networks (GNNs) are effective models for representation learning on relational data.
Standard GNNs are limited in their expressive power, as they cannot distinguish beyond the capability of the Weisfeiler-Leman graph isomorphism test.
In this work, we analyze the expressive power of GNNs with random node initialization (RNI).
We prove that these models are universal, a first such result for GNNs not relying on computationally demanding higher-order properties.
arXiv Detail & Related papers (2020-10-02T19:53:05Z) - Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order isomorphism tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of graph representation learning methods.
arXiv Detail & Related papers (2020-08-31T23:15:40Z) - Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.