QESK: Quantum-based Entropic Subtree Kernels for Graph Classification
- URL: http://arxiv.org/abs/2212.05228v1
- Date: Sat, 10 Dec 2022 07:10:03 GMT
- Title: QESK: Quantum-based Entropic Subtree Kernels for Graph Classification
- Authors: Lu Bai, Lixin Cui, Edwin R. Hancock
- Abstract summary: We propose a novel graph kernel, namely the Quantum-based Entropic Subtree Kernel (QESK) for Graph Classification.
We show how this AMM can be employed to compute a series of entropic subtree representations associated with the classical Weisfeiler-Lehman (WL) algorithm.
We show that the proposed QESK kernel can significantly outperform state-of-the-art graph kernels and graph deep learning methods for graph classification problems.
- Score: 11.51839867040302
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a novel graph kernel, namely the Quantum-based
Entropic Subtree Kernel (QESK), for Graph Classification. To this end, we
commence by computing the Average Mixing Matrix (AMM) of the Continuous-time
Quantum Walk (CTQW) evolved on each graph structure. Moreover, we show how this
AMM can be employed to compute a series of entropic subtree
representations associated with the classical Weisfeiler-Lehman (WL) algorithm.
For a pair of graphs, the QESK kernel is defined by computing the
exponentiation of the negative Euclidean distance between their entropic
subtree representations, theoretically resulting in a positive definite graph
kernel. We show that the proposed QESK kernel not only encapsulates complicated
intrinsic quantum-based structural characteristics of graph structures through
the CTQW, but also theoretically addresses the shortcoming of ignoring the
effects of unshared substructures arising in state-of-the-art R-convolution
graph kernels. Moreover, unlike classical R-convolution kernels, the
proposed QESK can discriminate between isomorphic subtrees in terms of
the global graph structure, theoretically explaining its effectiveness.
Experiments indicate that the proposed QESK kernel can significantly outperform
state-of-the-art graph kernels and graph deep learning methods for graph
classification problems.
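The pipeline described in the abstract (the AMM of a CTQW on each graph, WL-style entropic subtree features, then the exponentiation of the negative Euclidean distance) can be sketched roughly as follows. This is a simplified illustration, not the authors' exact formulation: the AMM uses the standard spectral formula Q = Σ_λ P_λ ∘ P_λ, but the neighbourhood aggregation and the depth-wise summation in `entropic_subtree_representation` are assumptions for illustration only.

```python
import numpy as np

def average_mixing_matrix(A):
    """Average Mixing Matrix (AMM) of a continuous-time quantum walk on the
    graph with adjacency matrix A. With the spectral decomposition
    A = sum_l lambda_l P_l, the AMM is Q = sum_l P_l * P_l, where * is the
    entrywise (Hadamard) product; each row of Q is a probability
    distribution over the vertices."""
    lam, V = np.linalg.eigh(A)
    n = len(lam)
    Q = np.zeros((n, n))
    i = 0
    while i < n:
        j = i
        while j < n and np.isclose(lam[j], lam[i]):
            j += 1                        # group numerically equal eigenvalues
        P = V[:, i:j] @ V[:, i:j].T       # projector onto this eigenspace
        Q += P * P                        # Hadamard square of the projector
        i = j
    return Q

def entropic_subtree_representation(A, h=3):
    """Hypothetical h-iteration WL-style feature vector: start from the
    Shannon entropy of each vertex's AMM row, then repeatedly aggregate
    entropies over 1-hop neighbourhoods, recording the total at each depth."""
    P = np.clip(average_mixing_matrix(A), 1e-12, 1.0)
    ent = -np.sum(P * np.log(P), axis=1)  # per-vertex row entropies
    feats = [ent.sum()]
    for _ in range(h):
        ent = ent + A @ ent               # WL-style neighbourhood aggregation
        feats.append(ent.sum())
    return np.array(feats)

def qesk(A1, A2, h=3):
    """Kernel value: exponentiation of the negative Euclidean distance
    between the two graphs' entropic subtree representations."""
    f1 = entropic_subtree_representation(A1, h)
    f2 = entropic_subtree_representation(A2, h)
    return np.exp(-np.linalg.norm(f1, ord=2) * 0 + -np.linalg.norm(f1 - f2))
```

For identical graphs the distance is zero, so the kernel value is 1; and because exp(-d) over a Euclidean distance is the Laplacian kernel, the resulting Gram matrix is positive definite, consistent with the abstract's claim.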
Related papers
- LSEnet: Lorentz Structural Entropy Neural Network for Deep Graph Clustering [59.89626219328127]
Graph clustering is a fundamental problem in machine learning.
Deep learning methods have achieved state-of-the-art results in recent years, but they still cannot work without predefined cluster numbers.
We propose to address this problem from a fresh perspective of graph information theory.
arXiv Detail & Related papers (2024-05-20T05:46:41Z) - Deep Hierarchical Graph Alignment Kernels [16.574634620245487]
We introduce Deep Hierarchical Graph Alignment Kernels (DHGAK) to resolve this problem.
Specifically, the relational substructures are hierarchically aligned to cluster distributions in their deep embedding space.
DHGAK is positive semi-definite and has linear separability in the Reproducing Kernel Hilbert Space.
arXiv Detail & Related papers (2024-05-09T05:08:30Z) - AKBR: Learning Adaptive Kernel-based Representations for Graph Classification [43.87164777893224]
We propose a new model to learn Adaptive Kernel-based Representations (AKBR) for graph classification.
The proposed AKBR approach aims to define an end-to-end representation learning model to construct an adaptive kernel matrix for graphs.
Experimental results show that the proposed AKBR model outperforms existing state-of-the-art graph kernels and deep learning methods on standard graph benchmarks.
arXiv Detail & Related papers (2024-03-24T13:01:05Z) - AERK: Aligned Entropic Reproducing Kernels through Continuous-time Quantum Walks [17.95088104970343]
We develop an Aligned Entropic Reproducing Kernel (AERK) for graph classification.
For a pair of graphs, the proposed AERK kernel is defined by computing a reproducing-kernel-based similarity between the quantum Shannon entropies of each pair of their aligned vertices.
The experimental evaluation on standard graph datasets demonstrates that the proposed AERK kernel is able to outperform state-of-the-art graph kernels for graph classification tasks.
arXiv Detail & Related papers (2023-03-04T16:48:39Z) - HAQJSK: Hierarchical-Aligned Quantum Jensen-Shannon Kernels for Graph Classification [17.95088104970343]
We propose a family of novel quantum kernels, namely the Hierarchical Aligned Quantum Jensen-Shannon Kernels (HAQJSK)
We show that the proposed HAQJSK kernels reflect richer intrinsic global graph characteristics in terms of the Continuous-Time Quantum Walk (CTQW)
Unlike the previous Quantum Jensen-Shannon Kernels associated with the QJSD and the CTQW, the proposed HAQJSK kernels can simultaneously guarantee the properties of permutation invariance and positive definiteness.
arXiv Detail & Related papers (2022-11-05T13:35:04Z) - Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z) - Dirichlet Graph Variational Autoencoder [65.94744123832338]
We present Dirichlet Graph Variational Autoencoder (DGVAE) with graph cluster memberships as latent factors.
Motivated by the low pass characteristics in balanced graph cut, we propose a new variant of GNN named Heatts to encode the input graph into cluster memberships.
arXiv Detail & Related papers (2020-10-09T07:35:26Z) - Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
arXiv Detail & Related papers (2020-05-16T04:44:29Z) - Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolution networks (GCN)
We design a specific GAE-based model for graph clustering to be consistent with the theory, namely Embedding Graph Auto-Encoder (EGAE)
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z) - Block-Approximated Exponential Random Graphs [77.4792558024487]
An important challenge in the field of exponential random graphs (ERGs) is the fitting of non-trivial ERGs on large graphs.
We propose an approximative framework for such non-trivial ERGs that results in dyadic independence (i.e., edge-independent) distributions.
Our methods are scalable to sparse graphs consisting of millions of nodes.
arXiv Detail & Related papers (2020-02-14T11:42:16Z) - A Hierarchical Transitive-Aligned Graph Kernel for Un-attributed Graphs [11.51839867040302]
We develop a new graph kernel, namely the Hierarchical Transitive-Aligned kernel, by transitively aligning the vertices between graphs.
The proposed kernel can outperform state-of-the-art graph kernels on standard graph-based datasets in terms of the classification accuracy.
arXiv Detail & Related papers (2020-02-08T11:46:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.