Heterogeneous Attributed Graph Learning via Neighborhood-Aware Star Kernels
- URL: http://arxiv.org/abs/2511.11245v1
- Date: Fri, 14 Nov 2025 12:45:22 GMT
- Title: Heterogeneous Attributed Graph Learning via Neighborhood-Aware Star Kernels
- Authors: Hong Huang, Chengyu Yao, Haiming Chen, Hang Gao
- Abstract summary: Neighborhood-Aware Star Kernel (NASK) is a novel graph kernel designed for attributed graph learning. NASK is positive definite, ensuring compatibility with kernel-based learning frameworks such as SVMs.
- Score: 9.639624729255514
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Attributed graphs, typically characterized by irregular topologies and a mix of numerical and categorical attributes, are ubiquitous in diverse domains such as social networks, bioinformatics, and cheminformatics. While graph kernels provide a principled framework for measuring graph similarity, existing kernel methods often struggle to simultaneously capture heterogeneous attribute semantics and neighborhood information in attributed graphs. In this work, we propose the Neighborhood-Aware Star Kernel (NASK), a novel graph kernel designed for attributed graph learning. NASK leverages an exponential transformation of the Gower similarity coefficient to jointly model numerical and categorical features efficiently, and employs star substructures enhanced by Weisfeiler-Lehman iterations to integrate multi-scale neighborhood structural information. We theoretically prove that NASK is positive definite, ensuring compatibility with kernel-based learning frameworks such as SVMs. Extensive experiments are conducted on eleven attributed and four large-scale real-world graph benchmarks. The results demonstrate that NASK consistently achieves superior performance over sixteen state-of-the-art baselines, including nine graph kernels and seven Graph Neural Networks.
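The abstract's core similarity measure, an exponential transformation of the Gower coefficient over mixed numerical and categorical attributes, can be sketched as follows. This is a minimal illustration only: the exact exponential form, the `gamma` bandwidth, and all function names are assumptions, not the paper's definitions.

```python
import numpy as np

def gower_similarity(x, y, is_categorical, ranges):
    """Gower similarity between two mixed-type attribute vectors.

    is_categorical: boolean mask marking categorical dimensions.
    ranges: per-dimension value ranges for the numerical dimensions
            (entries for categorical dimensions are ignored).
    """
    sims = np.empty(len(x))
    for i, cat in enumerate(is_categorical):
        if cat:
            # Categorical: exact match contributes 1, mismatch 0.
            sims[i] = 1.0 if x[i] == y[i] else 0.0
        else:
            # Numerical: range-normalized absolute difference.
            sims[i] = 1.0 - abs(x[i] - y[i]) / ranges[i]
    return sims.mean()

def exp_gower_kernel(x, y, is_categorical, ranges, gamma=1.0):
    # Exponential transform of the Gower coefficient; identical
    # vectors yield 1.0. The precise form used by NASK may differ.
    return np.exp(-gamma * (1.0 - gower_similarity(x, y, is_categorical, ranges)))
```

In NASK this node-level similarity is further combined with star substructures refined by Weisfeiler-Lehman iterations, so that the kernel also reflects multi-scale neighborhood structure.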
Related papers
- Metric Graph Kernels via the Tropical Torelli Map [0.0]
We propose new graph kernels grounded in the study of metric graphs via tropical algebraic geometry. Our graph kernels are based on the geometry and topology of the underlying metric space. Empirically, our kernels outperform existing methods in label-free settings.
arXiv Detail & Related papers (2025-05-17T20:00:50Z)
- Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs). This framework provides a standardized setting to evaluate GNNs across diverse datasets. We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z)
- Exploring Consistency in Graph Representations: from Graph Kernels to Graph Neural Networks [4.235378870514331]
Graph Neural Networks (GNNs) have emerged as a dominant approach in graph representation learning. We bridge the gap between neural network methods and kernel approaches by enabling GNNs to consistently capture structures in their learned representations. Inspired by these findings, we conjecture that consistency in the similarities of graph representations across GNN layers is crucial for capturing relational structures and enhancing graph classification performance.
arXiv Detail & Related papers (2024-10-31T09:07:08Z)
- Tensor-Fused Multi-View Graph Contrastive Learning [12.412040359604163]
Graph contrastive learning (GCL) has emerged as a promising approach to enhance graph neural networks' (GNNs) ability to learn rich representations from unlabeled graph-structured data. Current GCL models face challenges with computational demands and limited feature utilization. We propose TensorMV-GCL, a novel framework that integrates extended persistent homology with GCL representations and facilitates multi-scale feature extraction.
arXiv Detail & Related papers (2024-10-20T01:40:12Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority in the node classification task.
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Kernel-based Joint Multiple Graph Learning and Clustering of Graph Signals [2.4305626489408465]
We introduce KMGL, a kernel-based method for joint multiple graph learning and clustering of graph signals.
Experiments demonstrate that KMGL significantly enhances the robustness of graph learning and clustering, particularly in scenarios with high noise levels.
These findings underscore the potential of KMGL for improving the performance of Graph Signal Processing methods in diverse real-world applications.
arXiv Detail & Related papers (2023-10-29T13:41:12Z)
- Graph Neural Network-Inspired Kernels for Gaussian Processes in Semi-Supervised Learning [4.644263115284322]
Graph neural networks (GNNs) recently emerged as a promising class of models for graph-structured data in semi-supervised learning.
We introduce this inductive bias into GPs to improve their predictive performance for graph-structured data.
We show that these graph-based kernels lead to competitive classification and regression performance, as well as advantages in time, compared with the respective GNNs.
arXiv Detail & Related papers (2023-02-12T01:07:56Z)
- Federated Graph Classification over Non-IID Graphs [16.356867336591353]
Federated learning has emerged as an important paradigm for training machine learning models in different domains.
We propose a graph clustering federated learning framework that dynamically finds clusters of local systems based on the gradients of graph neural networks (GNNs).
arXiv Detail & Related papers (2021-06-25T04:25:29Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Neural Networks with Composite Kernels [60.81504431653264]
We re-interpret node aggregation from the perspective of kernel weighting.
We present a framework to consider feature similarity in an aggregation scheme.
We propose feature aggregation as the composition of the original neighbor-based kernel and a learnable kernel to encode feature similarities in a feature space.
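The composite-kernel aggregation described above can be sketched as follows; the RBF feature kernel, the elementwise composition with the adjacency matrix, and all names here are illustrative assumptions rather than the paper's actual formulation.

```python
import numpy as np

def composite_aggregate(X, A, W):
    """Kernel-weighted node aggregation with a composite kernel:
    a neighbor-based kernel (the adjacency matrix A) composed with a
    feature-similarity kernel computed in a learned space X @ W.
    X: (n, d) node features; A: (n, n) adjacency with self-loops;
    W: (d, k) learnable projection (here a fixed placeholder)."""
    Z = X @ W                                   # project features into a learned space
    # RBF-style feature kernel over pairwise distances in the learned space.
    K_feat = np.exp(-np.square(Z[:, None, :] - Z[None, :, :]).sum(-1))
    K = A * K_feat                              # compose: restrict to graph neighbors
    K = K / np.clip(K.sum(1, keepdims=True), 1e-12, None)  # row-normalize weights
    return K @ X                                # kernel-weighted feature aggregation
```

With `W` learned end-to-end, neighbors with similar features receive larger aggregation weights than plain mean aggregation would give them.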
arXiv Detail & Related papers (2020-05-16T04:44:29Z)
- Tensor Graph Convolutional Networks for Multi-relational and Robust Learning [74.05478502080658]
This paper introduces a tensor-graph convolutional network (TGCN) for scalable semi-supervised learning (SSL) from data associated with a collection of graphs, that are represented by a tensor.
The proposed architecture achieves markedly improved performance relative to standard GCNs, copes with state-of-the-art adversarial attacks, and leads to remarkable SSL performance over protein-to-protein interaction networks.
arXiv Detail & Related papers (2020-03-15T02:33:21Z)
- Embedding Graph Auto-Encoder for Graph Clustering [90.8576971748142]
Graph auto-encoder (GAE) models are based on semi-supervised graph convolutional networks (GCNs).
We design a specific GAE-based model for graph clustering, namely Embedding Graph Auto-Encoder (EGAE), to be consistent with the theory.
EGAE consists of one encoder and dual decoders.
arXiv Detail & Related papers (2020-02-20T09:53:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.