Message Detouring: A Simple Yet Effective Cycle Representation for Expressive Graph Learning
- URL: http://arxiv.org/abs/2402.08085v1
- Date: Mon, 12 Feb 2024 22:06:37 GMT
- Title: Message Detouring: A Simple Yet Effective Cycle Representation for Expressive Graph Learning
- Authors: Ziquan Wei, Tingting Dan, Guorong Wu
- Abstract summary: We introduce the concept of \textit{message detouring} to hierarchically characterize cycle representation throughout the entire graph.
Message detouring can significantly outperform current counterpart approaches on various benchmark datasets.
- Score: 4.085624738017079
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph learning is crucial in the fields of bioinformatics, social
networks, and chemistry. Although high-order graphlets, such as cycles, are
critical to achieving an informative graph representation for node
classification, edge prediction, and graph recognition, modeling high-order
topological characteristics poses significant computational challenges,
restricting their widespread application in machine learning. To address this
limitation, we introduce the concept of \textit{message detouring} to
hierarchically characterize cycle representation throughout the entire graph,
which capitalizes on the contrast between the shortest and longest pathways
within a range of local topologies associated with each graph node. The
topological feature representations derived from our message detouring
landscape demonstrate expressive power comparable to high-order
\textit{Weisfeiler-Lehman} (WL) tests, but with much lower computational
demands. In addition to integrating with graph kernels and message passing
neural networks, we present a novel message detouring neural network, which
uses a Transformer backbone to integrate cycle representations across nodes
and edges. Beyond the theoretical results, experiments on expressiveness,
graph classification, and node classification show that message detouring
significantly outperforms current counterpart approaches on various benchmark
datasets.
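The detour idea above lends itself to a concrete reading: for each node, compare the shortest route to a neighbor with the longest simple route through the node's local neighborhood, since a large gap signals a long local cycle. Below is a minimal networkx sketch of that reading; the function name, ego-graph radius, and path-length cutoff are illustrative assumptions, not the authors' implementation.

```python
# A minimal sketch of a detour-style cycle feature, inferred from the
# abstract's "contrast between the shortest and longest pathways within a
# range of local topologies". Illustrative only, not the paper's code.
import networkx as nx

def detour_feature(G: nx.Graph, node, radius: int = 2, cutoff: int = 6):
    """Contrast the shortest path with the longest simple path from `node`
    to each of its neighbors inside the `radius`-hop ego graph. A large
    gap indicates that the pair lies on a long local cycle."""
    ego = nx.ego_graph(G, node, radius=radius)
    contrasts = []
    for nbr in G.neighbors(node):
        shortest = nx.shortest_path_length(ego, node, nbr)
        # Longest simple path up to `cutoff` hops (bounded for tractability;
        # the unbounded problem is NP-hard in general graphs).
        longest = max(
            (len(p) - 1 for p in nx.all_simple_paths(ego, node, nbr, cutoff=cutoff)),
            default=shortest,
        )
        contrasts.append(longest - shortest)
    return contrasts

# Example: in a 6-cycle, each edge's endpoints are joined by a 1-hop and a
# 5-hop route, so the contrast per neighbor is 5 - 1 = 4.
G = nx.cycle_graph(6)
print(detour_feature(G, 0, radius=3, cutoff=6))  # [4, 4]
```

Bounding the simple-path search with a cutoff keeps the computation local, which is consistent with the abstract's claim of much lower cost than high-order WL tests.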
Related papers
- Unveiling Global Interactive Patterns across Graphs: Towards Interpretable Graph Neural Networks [31.29616732552006]
Graph Neural Networks (GNNs) have emerged as a prominent framework for graph mining.
This paper proposes a novel intrinsically interpretable scheme for graph classification.
Global Interactive Pattern (GIP) learning introduces learnable global interactive patterns to explicitly interpret decisions.
arXiv Detail & Related papers (2024-07-02T06:31:13Z)
- State of the Art and Potentialities of Graph-level Learning [54.68482109186052]
Graph-level learning has been applied to many tasks including comparison, regression, classification, and more.
Traditional approaches to learning a set of graphs rely on hand-crafted features, such as substructures.
Deep learning has helped graph-level learning adapt to the growing scale of graphs by extracting features automatically and encoding graphs into low-dimensional representations.
arXiv Detail & Related papers (2023-01-14T09:15:49Z)
- Shortest Path Networks for Graph Property Prediction [13.986963122264632]
Most graph neural network models rely on a particular message passing paradigm, where the idea is to iteratively propagate node representations of a graph to each node in the direct neighborhood.
We propose shortest path message passing neural networks, where the node representations of a graph are propagated to each node in the shortest path neighborhoods.
Our framework generalizes message passing neural networks, resulting in provably more expressive models.
arXiv Detail & Related papers (2022-06-02T12:04:29Z)
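The shortest-path neighborhoods mentioned above generalize ordinary one-hop aggregation: each node pools messages from the shell of nodes at shortest-path distance k, with separate weights per hop. A minimal numpy/networkx sketch of that aggregation pattern follows; the mean aggregator, ReLU, and weight shapes are assumptions rather than the paper's exact model.

```python
# One layer of shortest-path message passing over k-hop shells.
# Illustrative sketch of the aggregation pattern, not the paper's model.
import numpy as np
import networkx as nx

def sp_message_passing(G: nx.Graph, X: np.ndarray, Ws: list):
    """h_v = ReLU( sum_k mean_{u : d(v,u)=k} X_u @ Ws[k-1] )."""
    n, d = X.shape
    H = np.zeros((n, Ws[0].shape[1]))
    # Shortest-path distances up to the deepest hop we have weights for.
    dist = dict(nx.all_pairs_shortest_path_length(G, cutoff=len(Ws)))
    for v in G.nodes:
        for k, W in enumerate(Ws, start=1):
            shell = [u for u, dv in dist[v].items() if dv == k]
            if shell:
                H[v] += X[shell].mean(axis=0) @ W
    return np.maximum(H, 0.0)  # ReLU

G = nx.cycle_graph(6)
X = np.eye(6)                                          # one-hot features
Ws = [np.random.randn(6, 4) * 0.1 for _ in range(2)]   # hops 1 and 2
print(sp_message_passing(G, X, Ws).shape)              # (6, 4)
```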
- Hyperbolic Graph Neural Networks: A Review of Methods and Applications [55.5502008501764]
Graph neural networks generalize conventional neural networks to graph-structured data.
The performance of Euclidean models in graph-related learning remains limited by the representation ability of Euclidean geometry.
Recently, hyperbolic space has gained increasing popularity in processing graph data with tree-like structure and power-law distribution.
arXiv Detail & Related papers (2022-02-28T15:08:48Z)
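One way to see why hyperbolic space suits tree-like, power-law graphs: in the Poincaré ball, distances blow up near the boundary, mirroring the exponential growth of tree neighborhoods. A small sketch of the standard Poincaré distance formula (a generic illustration, not code from the review):

```python
# Poincaré-ball distance, the standard formula used by hyperbolic GNNs.
import numpy as np

def poincare_distance(u: np.ndarray, v: np.ndarray) -> float:
    """Geodesic distance between points inside the unit ball (||x|| < 1)."""
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq / denom))

# Points near the boundary are far apart even when Euclidean-close:
a, b = np.array([0.95, 0.0]), np.array([0.0, 0.95])
print(poincare_distance(a, b))   # large hyperbolic distance (~6.6)
print(np.linalg.norm(a - b))     # modest Euclidean distance (~1.34)
```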
- Graph Decipher: A transparent dual-attention graph neural network to understand the message-passing mechanism for the node classification [2.0047096160313456]
We propose a new transparent network called Graph Decipher to investigate the message-passing mechanism.
Our algorithm achieves state-of-the-art performance while imposing a substantially lower computational burden on the node classification task.
arXiv Detail & Related papers (2022-01-04T23:24:00Z)
- Learning the Implicit Semantic Representation on Graph-Structured Data [57.670106959061634]
Existing representation learning methods in graph convolutional networks are mainly designed by describing the neighborhood of each node as a perceptual whole.
We propose Semantic Graph Convolutional Networks (SGCN), which explore the implicit semantics by learning latent semantic paths in graphs.
arXiv Detail & Related papers (2021-01-16T16:18:43Z)
- Co-embedding of Nodes and Edges with Graph Neural Networks [13.020745622327894]
Graph embedding is a way to transform and encode the data structure of a graph in a high-dimensional, non-Euclidean feature space.
CensNet is a general graph embedding framework, which embeds both nodes and edges to a latent feature space.
Our approach achieves or matches the state-of-the-art performance in four graph learning tasks.
arXiv Detail & Related papers (2020-10-25T22:39:31Z)
- Towards Deeper Graph Neural Networks [63.46470695525957]
Graph convolutions perform neighborhood aggregation and represent one of the most important graph operations.
Several recent studies attribute the performance deterioration of deeper models to the over-smoothing issue.
We propose Deep Adaptive Graph Neural Network (DAGNN) to adaptively incorporate information from large receptive fields.
arXiv Detail & Related papers (2020-07-18T01:11:14Z)
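The "large receptive fields" phrasing above can be pictured as propagating features over many hops and letting each node learn how much of every hop to retain. A rough numpy sketch of that decouple-then-reweight pattern; the sigmoid retention scoring and the crude normalization are simplified stand-ins, not the paper's exact design.

```python
# A rough sketch of a decouple-and-reweight readout: propagate features K
# hops without transformation, then combine hops with learned per-node
# retention scores. `s` is a simplified stand-in scoring vector.
import numpy as np

def adaptive_hop_readout(A_hat: np.ndarray, Z: np.ndarray, K: int, s: np.ndarray):
    """A_hat: normalized adjacency (n, n); Z: transformed features (n, d);
    s: scoring vector (d,). Returns adaptively combined representations."""
    hops = [Z]
    for _ in range(K):
        hops.append(A_hat @ hops[-1])           # one more propagation step
    H = np.stack(hops, axis=1)                  # (n, K+1, d)
    scores = 1 / (1 + np.exp(-(H @ s)))         # sigmoid retention per hop
    return (scores[..., None] * H).sum(axis=1)  # weighted sum over hops

n, d, K = 5, 4, 3
A = np.random.rand(n, n); A = (A + A.T) / 2
A_hat = A / A.sum(axis=1, keepdims=True)        # crude row normalization
Z = np.random.randn(n, d)
print(adaptive_hop_readout(A_hat, Z, K, np.random.randn(d)).shape)  # (5, 4)
```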
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- GCC: Graph Contrastive Coding for Graph Neural Network Pre-Training [62.73470368851127]
Graph representation learning has emerged as a powerful technique for addressing real-world problems.
We design Graph Contrastive Coding -- a self-supervised graph neural network pre-training framework.
We conduct experiments on three graph learning tasks and ten graph datasets.
arXiv Detail & Related papers (2020-06-17T16:18:35Z)
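Contrastive pre-training of this kind typically pulls together embeddings of two augmented views of the same (sub)graph and pushes apart different ones via an InfoNCE-style loss. A minimal numpy sketch of that objective (the encoder and augmentations are placeholders; only the loss is shown):

```python
# InfoNCE over paired subgraph embeddings: z1[i] and z2[i] are two
# augmented views of subgraph i; matching pairs sit on the diagonal.
import numpy as np

def info_nce(z1: np.ndarray, z2: np.ndarray, tau: float = 0.07) -> float:
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / tau                      # all-pairs similarity
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return float(-log_probs.diagonal().mean())    # positives on diagonal

rng = np.random.default_rng(0)
z = rng.normal(size=(8, 16))
# Nearly identical views -> positives dominate -> small loss.
print(info_nce(z + 0.01 * rng.normal(size=z.shape), z))
```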