Labeled Subgraph Entropy Kernel
- URL: http://arxiv.org/abs/2303.13543v1
- Date: Tue, 21 Mar 2023 12:27:21 GMT
- Title: Labeled Subgraph Entropy Kernel
- Authors: Chengyu Sun, Xing Ai, Zhihong Zhang, Edwin R Hancock
- Abstract summary: We propose a novel labeled subgraph entropy graph kernel, which performs well in structural similarity assessment.
We design a dynamic programming subgraph enumeration algorithm, which effectively reduces the time complexity.
To test our method, we apply it to several real-world datasets and assess its performance on different tasks.
- Score: 11.812319306576816
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In recent years, kernel methods have become widespread in similarity-measurement tasks. Specifically, graph kernels are widely used in bioinformatics, chemistry and financial data analysis. However, existing methods, especially entropy-based graph kernels, suffer from high computational complexity and neglect node-level information. In this paper, we propose a novel labeled subgraph entropy graph kernel, which performs well in structural similarity assessment. We design a dynamic-programming subgraph enumeration algorithm that effectively reduces the time complexity. In particular, we propose the labeled subgraph, which enriches substructure topology with semantic information. By analogy with the cluster expansion of a gas in statistical mechanics, we re-derive the partition function and compute the global graph entropy to characterize the network. To test our method, we apply it to several real-world datasets and assess its performance on different tasks. To provide further insight into the experiments, we quantitatively and qualitatively analyze the contribution of different topological structures. Experimental results demonstrate the effectiveness of our method, which outperforms several state-of-the-art methods.
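The abstract gives no pseudocode, but the overall pipeline (enumerate small labeled subgraphs, summarize their distribution, derive an entropy-style similarity) can be illustrated. The sketch below is an assumption-laden approximation rather than the authors' method: it brute-forces connected induced subgraphs (the paper uses a dynamic-programming enumerator to reduce complexity), assumes node labels live in a networkx "label" attribute, and substitutes a plain Shannon entropy and histogram intersection for the paper's partition-function-derived global entropy.

```python
# Illustrative sketch only: NOT the paper's kernel. Brute-force enumeration of
# connected labeled subgraphs plus a simple entropy/histogram similarity,
# standing in for the DP enumerator and partition-function-based entropy.
from collections import Counter
from itertools import combinations
from math import log

import networkx as nx


def labeled_subgraph_histogram(G: nx.Graph, max_size: int = 3) -> Counter:
    """Count connected induced subgraphs with up to `max_size` nodes,
    keyed by the multiset of node labels (assumed to be stored in the
    'label' node attribute)."""
    hist = Counter()
    for k in range(1, max_size + 1):
        for nodes in combinations(G.nodes, k):
            sub = G.subgraph(nodes)
            if nx.is_connected(sub):
                key = tuple(sorted(G.nodes[n]["label"] for n in nodes))
                hist[key] += 1
    return hist


def graph_entropy(hist: Counter) -> float:
    """Shannon entropy of the labeled-subgraph distribution (a crude proxy
    for the paper's global graph entropy)."""
    total = sum(hist.values())
    return -sum((c / total) * log(c / total) for c in hist.values())


def entropy_kernel(G1: nx.Graph, G2: nx.Graph, max_size: int = 3) -> float:
    """One plausible similarity: normalized histogram intersection, damped by
    the gap between the two global entropies. Not the paper's exact formula."""
    h1 = labeled_subgraph_histogram(G1, max_size)
    h2 = labeled_subgraph_histogram(G2, max_size)
    n1, n2 = sum(h1.values()), sum(h2.values())
    overlap = sum(min(h1[k] / n1, h2[k] / n2) for k in set(h1) | set(h2))
    return overlap / (1.0 + abs(graph_entropy(h1) - graph_entropy(h2)))
```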
Related papers
- Towards Subgraph Isomorphism Counting with Graph Kernels [45.687427850611314]
Subgraph isomorphism counting is known to be #P-complete and requires exponential time to obtain exact solutions.
We investigate the potential of graph kernels for counting subgraph isomorphisms and explore augmenting their capability through various variants.
We present the results of extensive experiments to demonstrate the effectiveness of the enhanced graph kernels and discuss promising directions for future research.
arXiv Detail & Related papers (2024-05-13T06:33:06Z) - Hypergraph Isomorphism Computation [20.21325386855039]
We propose a general hypergraph Weisfeiler-Lehman kernel framework and implement two instances: the Hypergraph Weisfeiler-Lehman Subtree Kernel and the Hypergraph Weisfeiler-Lehman Hyperedge Kernel.
Results on hypergraph classification datasets show significant improvements compared to other typical kernel-based methods.
arXiv Detail & Related papers (2023-07-26T09:39:40Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Bures-Wasserstein Means of Graphs [60.42414991820453]
We propose a novel framework for defining a graph mean via embeddings in the space of smooth graph signal distributions.
By finding a mean in this embedding space, we can recover a mean graph that preserves structural information.
We establish the existence and uniqueness of the novel graph mean, and provide an iterative algorithm for computing it.
arXiv Detail & Related papers (2023-05-31T11:04:53Z) - SHGNN: Structure-Aware Heterogeneous Graph Neural Network [77.78459918119536]
This paper proposes a novel Structure-Aware Heterogeneous Graph Neural Network (SHGNN) to address the above limitations.
We first utilize a feature propagation module to capture the local structure information of intermediate nodes in the meta-path.
Next, we use a tree-attention aggregator to incorporate the graph structure information into the aggregation module on the meta-path.
Finally, we leverage a meta-path aggregator to fuse the information aggregated from different meta-paths.
arXiv Detail & Related papers (2021-12-12T14:18:18Z) - Learning subtree pattern importance for Weisfeiler-Lehmanbased graph
kernels [15.139294083028782]
We propose a method to learn the weights of subtree patterns in the framework of WWL kernels.
We present an efficient learning algorithm and also derive a generalization gap bound to show its convergence.
arXiv Detail & Related papers (2021-06-08T23:47:44Z) - Self-supervised Graph-level Representation Learning with Local and
Global Structure [71.45196938842608]
We propose a unified framework called Local-instance and Global-semantic Learning (GraphLoG) for self-supervised whole-graph representation learning.
Besides preserving the local similarities, GraphLoG introduces the hierarchical prototypes to capture the global semantic clusters.
An efficient online expectation-maximization (EM) algorithm is further developed for learning the model.
arXiv Detail & Related papers (2021-06-08T05:25:38Z) - Multilayer Clustered Graph Learning [66.94201299553336]
We use a contrastive loss as a data fidelity term in order to properly aggregate the observed layers into a representative graph.
Experiments show that our method leads to a clustered representative graph, which can then be used to solve clustering problems.
arXiv Detail & Related papers (2020-10-29T09:58:02Z) - Density of States Graph Kernels [10.200937444995944]
Graph kernels are an established technique for quantifying similarity between graphs.
We recast random walk kernels under the more general framework of density of states.
We use our interpretation to construct scalable, composite density-of-states-based graph kernels (a minimal illustrative sketch appears after this list).
arXiv Detail & Related papers (2020-10-21T22:44:59Z) - Graph Pooling with Node Proximity for Hierarchical Representation
Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)
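As a companion to the Density of States Graph Kernels entry above, the sketch below is only an illustrative stand-in, not that paper's implementation: it computes the exact spectral density of the symmetrically normalized adjacency matrix (the original work approximates it scalably with moment-based methods) and compares two graphs by histogram intersection; the bin count and the intersection similarity are arbitrary choices made here for illustration.

```python
# Illustrative sketch only: exact eigendecomposition is feasible only for small
# graphs; the referenced paper approximates the density of states scalably.
import numpy as np
import networkx as nx


def spectral_density(G: nx.Graph, bins: int = 50) -> np.ndarray:
    """Histogram of the eigenvalues of D^{-1/2} A D^{-1/2}, whose spectrum
    lies in [-1, 1]; this histogram is the graph's density of states."""
    A = nx.to_numpy_array(G)
    d = A.sum(axis=1)
    with np.errstate(divide="ignore"):
        d_inv_sqrt = np.where(d > 0, 1.0 / np.sqrt(d), 0.0)
    A_norm = d_inv_sqrt[:, None] * A * d_inv_sqrt[None, :]
    eigvals = np.linalg.eigvalsh(A_norm)
    hist, _ = np.histogram(eigvals, bins=bins, range=(-1.0, 1.0))
    return hist / hist.sum()


def dos_kernel(G1: nx.Graph, G2: nx.Graph, bins: int = 50) -> float:
    """Similarity of two graphs as the intersection of their spectral densities."""
    p, q = spectral_density(G1, bins), spectral_density(G2, bins)
    return float(np.minimum(p, q).sum())  # histogram intersection in [0, 1]
```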
This list is automatically generated from the titles and abstracts of the papers on this site.