KAGNNs: Kolmogorov-Arnold Networks meet Graph Learning
- URL: http://arxiv.org/abs/2406.18380v4
- Date: Thu, 06 Mar 2025 10:25:17 GMT
- Title: KAGNNs: Kolmogorov-Arnold Networks meet Graph Learning
- Authors: Roman Bresson, Giannis Nikolentzos, George Panagopoulos, Michail Chatzianastasis, Jun Pang, Michalis Vazirgiannis
- Abstract summary: Graph Neural Networks (GNNs) have become the de facto tool for learning node and graph representations. We implement three new KAN-based GNN layers, inspired respectively by the GCN, GAT and GIN layers. Our results indicate that KANs are on-par with or better than MLPs on all tasks studied in this paper.
- Score: 27.638009679134523
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In recent years, Graph Neural Networks (GNNs) have become the de facto tool for learning node and graph representations. Most GNNs typically consist of a sequence of neighborhood aggregation (a.k.a., message-passing) layers, within which the representation of each node is updated based on those of its neighbors. The most expressive message-passing GNNs can be obtained through the use of the sum aggregator and of MLPs for feature transformation, thanks to their universal approximation capabilities. However, the limitations of MLPs recently motivated the introduction of another family of universal approximators, called Kolmogorov-Arnold Networks (KANs), which rely on a different representation theorem. In this work, we compare the performance of KANs against that of MLPs on graph learning tasks. We implement three new KAN-based GNN layers, inspired respectively by the GCN, GAT and GIN layers. We evaluate two different implementations of KANs using two distinct base families of functions, namely B-splines and radial basis functions. We perform extensive experiments on node classification, link prediction, graph classification and graph regression datasets. Our results indicate that KANs are on-par with or better than MLPs on all tasks studied in this paper. We also show that the size and training time of RBF-based KANs are only marginally greater than those of MLPs, making them viable alternatives. Code is available at https://github.com/RomanBresson/KAGNN.
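To make the abstract's construction concrete, below is a minimal illustrative sketch (not the authors' implementation; their code is at the GitHub link above) of an RBF-based KAN layer and a GIN-style GNN layer that uses stacked KAN layers in place of the usual MLP, assuming PyTorch and a dense adjacency matrix. All names (RBFKANLayer, KAGINConv, num_centers) are hypothetical.

```python
import torch
import torch.nn as nn


class RBFKANLayer(nn.Module):
    """A KAN layer: each (input, output) pair gets a learnable univariate
    function, expressed here in a Gaussian radial-basis-function basis;
    each output is the sum of these functions over the inputs."""

    def __init__(self, in_dim, out_dim, num_centers=8, grid=(-2.0, 2.0)):
        super().__init__()
        self.register_buffer("centers", torch.linspace(grid[0], grid[1], num_centers))
        self.gamma = (num_centers - 1) / (grid[1] - grid[0])  # RBF bandwidth
        # one coefficient per (input dim, output dim, basis function)
        self.coeffs = nn.Parameter(torch.randn(in_dim, out_dim, num_centers) * 0.1)

    def forward(self, x):  # x: (num_nodes, in_dim)
        # Gaussian RBF features of each scalar input: (num_nodes, in_dim, K)
        rbf = torch.exp(-(self.gamma * (x.unsqueeze(-1) - self.centers)) ** 2)
        # out[n, o] = sum_i phi_{i,o}(x[n, i])
        return torch.einsum("nik,iok->no", rbf, self.coeffs)


class KAGINConv(nn.Module):
    """GIN-style message passing with the MLP swapped for stacked KAN layers:
    h_v' = KAN((1 + eps) * h_v + sum_{u in N(v)} h_u)."""

    def __init__(self, in_dim, hidden_dim, out_dim):
        super().__init__()
        self.eps = nn.Parameter(torch.zeros(1))
        self.kan = nn.Sequential(
            RBFKANLayer(in_dim, hidden_dim),
            RBFKANLayer(hidden_dim, out_dim),
        )

    def forward(self, x, adj):  # adj: dense (num_nodes, num_nodes) adjacency
        agg = adj @ x  # sum aggregation over neighbors
        return self.kan((1 + self.eps) * x + agg)


# Toy usage: 5 nodes, 3 features, a random symmetric adjacency matrix.
x = torch.randn(5, 3)
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.T) > 0).float().fill_diagonal_(0)
print(KAGINConv(3, 16, 8)(x, adj).shape)  # torch.Size([5, 8])
```

Note that the sum aggregator is kept exactly as in a standard GIN layer; the only change is that the feature transformation is a KAN rather than an MLP, which is the substitution the paper studies.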
Related papers
- VQGraph: Rethinking Graph Representation Space for Bridging GNNs and MLPs [97.63412451659826]
VQGraph learns a structure-aware tokenizer on graph data that can encode each node's local substructure as a discrete code.
VQGraph achieves new state-of-the-art performance on GNN-to-MLP distillation in both transductive and inductive settings.
arXiv Detail & Related papers (2023-08-04T02:58:08Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Learnable Graph Convolutional Attention Networks [7.465923786151107]
Graph Neural Networks (GNNs) compute the message exchange between nodes by either aggregating uniformly (convolving) the features of all the neighboring nodes, or by applying a non-uniform score (attending) to the features.
Recent works have shown the respective strengths and weaknesses of the resulting GNN architectures, GCNs and GATs.
We introduce the graph convolutional attention layer (CAT), which relies on convolutions to compute the attention scores.
Our results demonstrate that L-CAT, a learnable variant of CAT, is able to efficiently combine different GNN layers along the network, outperforming competing methods in a wide range of datasets.
arXiv Detail & Related papers (2022-11-21T21:08:58Z)
- Neighborhood Convolutional Network: A New Paradigm of Graph Neural Networks for Node Classification [12.062421384484812]
The decoupled Graph Convolutional Network (GCN) separates neighborhood aggregation from feature transformation in each convolutional layer.
In this paper, we propose a new paradigm of GCN, termed Neighborhood Convolutional Network (NCN).
In this way, the model inherits the merit of the decoupled GCN for aggregating neighborhood information while developing much more powerful feature-learning modules.
arXiv Detail & Related papers (2022-11-15T02:02:51Z)
- Weisfeiler and Leman go Hyperbolic: Learning Distance Preserving Node Representations [26.77596449192451]
Graph neural networks (GNNs) have emerged as a promising tool for solving machine learning problems on graphs.
In this paper, we define a distance function between nodes which is based on the hierarchy produced by the Weisfeiler-Leman (WL) algorithm.
We propose a model that learns representations which preserve those distances between nodes.
arXiv Detail & Related papers (2022-11-04T15:03:41Z)
- Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z)
- Incorporating Heterophily into Graph Neural Networks for Graph Classification [6.709862924279403]
Graph Neural Networks (GNNs) often assume strong homophily for graph classification, seldom considering heterophily.
We develop a novel GNN architecture called IHGNN (short for Incorporating Heterophily into Graph Neural Networks).
We empirically validate IHGNN on various graph datasets and demonstrate that it outperforms the state-of-the-art GNNs for graph classification.
arXiv Detail & Related papers (2022-03-15T06:48:35Z)
- Graph Neural Networks with Feature and Structure Aware Random Walk [7.143879014059894]
We show that in typical heterophilous graphs, the edges may be directed, and whether to treat the edges as-is or simply make them undirected greatly affects the performance of the GNN models.
We develop a model that adaptively learns the directionality of the graph, and exploits the underlying long-distance correlations between nodes.
arXiv Detail & Related papers (2021-11-19T08:54:21Z)
- On the approximation capability of GNNs in node classification/regression tasks [4.141514895639094]
Graph Neural Networks (GNNs) are a broad class of connectionist models for graph processing.
We show that GNNs are universal approximators in probability for node classification/regression tasks.
arXiv Detail & Related papers (2021-06-16T17:46:51Z)
- Scalable Graph Neural Networks for Heterogeneous Graphs [12.44278942365518]
Graph neural networks (GNNs) are a popular class of parametric models for learning over graph-structured data.
Recent works have argued that GNNs primarily use the graph for feature smoothing, and have shown competitive results on benchmark tasks.
In this work, we ask whether these results can be extended to heterogeneous graphs, which encode multiple types of relationship between different entities.
arXiv Detail & Related papers (2020-11-19T06:03:35Z)
- On Graph Neural Networks versus Graph-Augmented MLPs [51.23890789522705]
Graph-Augmented Multi-Layer Perceptrons (GA-MLPs) first augment node features with certain multi-hop operators on the graph.
We prove a separation in expressive power between GA-MLPs and GNNs that grows exponentially in depth.
arXiv Detail & Related papers (2020-10-28T17:59:59Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from this unified framework (UGNN), to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- CatGCN: Graph Convolutional Networks with Categorical Node Features [99.555850712725]
CatGCN is tailored for graph learning when the node features are categorical.
We train CatGCN in an end-to-end fashion and demonstrate it on semi-supervised node classification.
arXiv Detail & Related papers (2020-09-11T09:25:17Z)
- Distance Encoding: Design Provably More Powerful Neural Networks for Graph Representation Learning [63.97983530843762]
Graph Neural Networks (GNNs) have achieved great success in graph representation learning.
However, GNNs generate identical representations for graph substructures that may in fact be very different.
More powerful GNNs, proposed recently by mimicking higher-order Weisfeiler-Leman tests, are inefficient as they cannot exploit the sparsity of the underlying graph structure.
We propose Distance Encoding (DE) as a new class of structure-related features for graph representation learning (a minimal sketch of one DE variant follows this list).
arXiv Detail & Related papers (2020-08-31T23:15:40Z)
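To make the Distance Encoding idea above concrete, here is a minimal sketch (my illustration, not the paper's code) of one simple DE variant: shortest-path distances from every node to a target node set, computed by BFS and appended to node features before running a GNN. The function name spd_from_targets and the max_dist cap are hypothetical.

```python
from collections import deque

def spd_from_targets(adj_list, targets, max_dist=5):
    """BFS shortest-path distance from every node to the nearest target,
    capped at max_dist (unreachable nodes keep max_dist)."""
    dist = {v: max_dist for v in adj_list}
    queue = deque()
    for t in targets:
        dist[t] = 0
        queue.append(t)
    while queue:
        u = queue.popleft()
        for w in adj_list[u]:
            if dist[w] > dist[u] + 1:
                dist[w] = min(dist[u] + 1, max_dist)
                if dist[w] < max_dist:
                    queue.append(w)
    return dist

# Toy graph: path 0-1-2-3; target set {0}. Each node's DE feature is its
# capped distance to node 0, consumed by a GNN alongside its own attributes.
adj_list = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(spd_from_targets(adj_list, targets=[0]))  # {0: 0, 1: 1, 2: 2, 3: 3}
```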
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.