A Simple yet Effective Method for Graph Classification
- URL: http://arxiv.org/abs/2206.02404v1
- Date: Mon, 6 Jun 2022 07:24:44 GMT
- Title: A Simple yet Effective Method for Graph Classification
- Authors: Junran Wu, Shangzhe Li, Jianhao Li, Yicheng Pan and Ke Xu
- Abstract summary: We investigate the feasibility of improving graph classification performance while simplifying the learning process.
Inspired by structural entropy on graphs, we transform the data sample from graphs to coding trees.
We present a tree kernel and a convolutional network to implement our scheme for graph classification.
- Score: 7.397201068210497
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In deep neural networks, better results can often be obtained by increasing
the complexity of previously developed basic models. However, it is unclear
whether there is a way to boost performance by decreasing the complexity of
such models. Intuitively, given a problem, a simpler data structure comes with
a simpler algorithm. Here, we investigate the feasibility of improving graph
classification performance while simplifying the learning process. Inspired by
structural entropy on graphs, we transform the data samples from graphs to
coding trees, which are simpler but essential structures for graph data.
Furthermore, we propose a novel message passing scheme, termed hierarchical
reporting, in which features are transferred from leaf nodes to root nodes by
following the hierarchical structure of coding trees. We then present a tree
kernel and a convolutional network to implement our scheme for graph
classification. With the designed message passing scheme, the tree kernel and
the convolutional network run in $O(n)$ time, lower than the Weisfeiler-Lehman
subtree kernel and other graph neural networks, which require at least $O(hm)$.
We empirically validate our methods with several graph classification
benchmarks and demonstrate that they achieve better performance and lower
computational consumption than competing approaches.
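The hierarchical reporting scheme described above can be sketched as a single bottom-up pass over a tree: each internal node combines the features reported by its children, so every node is visited once and the pass runs in O(n). The parent-pointer tree encoding and the elementwise-sum combiner below are illustrative assumptions for this sketch, not the authors' exact implementation.

```python
# Hedged sketch of "hierarchical reporting" on a coding tree:
# features flow from leaf nodes up to the root, visiting each
# node exactly once, so the pass runs in O(n) time.
# The parent-pointer encoding and the sum combiner are
# illustrative assumptions, not the paper's exact design.

def hierarchical_report(parents, leaf_features, dim):
    """parents[i] is the parent index of node i (root has parent -1).
    leaf_features maps leaf index -> feature vector (list of floats).
    Returns the feature vector accumulated at the root."""
    n = len(parents)
    feats = [[0.0] * dim for _ in range(n)]
    for leaf, vec in leaf_features.items():
        feats[leaf] = list(vec)

    # Build child lists and locate the root.
    children = [[] for _ in range(n)]
    root = -1
    for i, p in enumerate(parents):
        if p == -1:
            root = i
        else:
            children[p].append(i)

    # Post-order traversal: combine children before their parent.
    def combine(node):
        for c in children[node]:
            child_vec = combine(c)
            feats[node] = [a + b for a, b in zip(feats[node], child_vec)]
        return feats[node]

    return combine(root)

# Toy coding tree: root 0 with internal nodes 1 and 2; leaves 3, 4, 5.
parents = [-1, 0, 0, 1, 1, 2]
leaves = {3: [1.0, 0.0], 4: [0.0, 1.0], 5: [2.0, 2.0]}
print(hierarchical_report(parents, leaves, dim=2))  # [3.0, 3.0]
```

The root's accumulated vector can then serve as a graph-level representation; in the paper's scheme a tree kernel or a learned combiner plays the role of the plain sum used here.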
Related papers
- CliquePH: Higher-Order Information for Graph Neural Networks through Persistent Homology on Clique Graphs [15.044471983688249]
We introduce a novel method that extracts information about higher-order structures in the graph.
Our method can lead to up to 31% improvements in test accuracy.
arXiv Detail & Related papers (2024-09-12T16:56:26Z)
- Efficient Heterogeneous Graph Learning via Random Projection [58.4138636866903]
Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for deep learning on heterogeneous graphs.
Recent pre-computation-based HGNNs use one-time message passing to transform a heterogeneous graph into regular-shaped tensors.
We propose a hybrid pre-computation-based HGNN, named Random Projection Heterogeneous Graph Neural Network (RpHGNN)
arXiv Detail & Related papers (2023-10-23T01:25:44Z)
- NodeFormer: A Scalable Graph Structure Learning Transformer for Node Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z)
- A Robust Stacking Framework for Training Deep Graph Models with Multifaceted Node Features [61.92791503017341]
Graph Neural Networks (GNNs) with numerical node features and graph structure as inputs have demonstrated superior performance on various supervised learning tasks with graph data.
However, the best models for such data in standard supervised learning settings with IID (non-graph) inputs are not easily incorporated into a GNN.
Here we propose a robust stacking framework that fuses graph-aware propagation with arbitrary models intended for IID data.
arXiv Detail & Related papers (2022-06-16T22:46:33Z)
- Optimal Propagation for Graph Neural Networks [51.08426265813481]
We propose a bi-level optimization approach for learning the optimal graph structure.
We also explore a low-rank approximation model for further reducing the time complexity.
arXiv Detail & Related papers (2022-05-06T03:37:00Z)
- Structural Optimization Makes Graph Classification Simpler and Better [5.770986723520119]
We investigate the feasibility of improving graph classification performance while simplifying the model learning process.
Inspired by progress in structural information assessment, we optimize the given data sample from graphs to encoding trees.
We present an implementation of the scheme in a tree kernel and a convolutional network to perform graph classification.
arXiv Detail & Related papers (2021-09-05T08:54:38Z)
- Self-Supervised Deep Graph Embedding with High-Order Information Fusion for Community Discovery [3.6002285517472767]
The proposed algorithm uses a self-supervised mechanism and different high-order information of the graph to train multiple deep graph convolutional neural networks.
The outputs of the multiple graph convolutional neural networks are fused to extract node representations that include the attribute and structure information of the graph.
arXiv Detail & Related papers (2021-02-05T17:22:28Z)
- Representation Learning of Reconstructed Graphs Using Random Walk Graph Convolutional Network [12.008472517000651]
We propose wGCN, a novel framework that utilizes random walks to obtain the node-specific mesoscopic structures of the graph.
We believe that incorporating high-order local structural information can more effectively explore the potential of the network.
arXiv Detail & Related papers (2021-01-02T10:31:14Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
- Geometrically Principled Connections in Graph Neural Networks [66.51286736506658]
We argue geometry should remain the primary driving force behind innovation in the emerging field of geometric deep learning.
We relate graph neural networks to widely successful computer graphics and data approximation models: radial basis functions (RBFs)
We introduce affine skip connections, a novel building block formed by combining a fully connected layer with any graph convolution operator.
arXiv Detail & Related papers (2020-04-06T13:25:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.