Scalable Graph Compressed Convolutions
- URL: http://arxiv.org/abs/2407.18480v2
- Date: Thu, 31 Oct 2024 04:58:09 GMT
- Title: Scalable Graph Compressed Convolutions
- Authors: Junshu Sun, Shuhui Wang, Chenxue Yang, Qingming Huang
- Abstract summary: We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
- Score: 68.85227170390864
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Designing effective graph neural networks (GNNs) with message passing has two fundamental challenges, i.e., determining optimal message-passing pathways and designing local aggregators. Previous methods of designing optimal pathways are limited by information loss on the input features. On the other hand, existing local aggregators generally fail to extract multi-scale features and approximate diverse operators under limited parameter scales. In contrast to these methods, Euclidean convolution has been proven as an expressive aggregator, making it a perfect candidate for GNN construction. However, the challenges of generalizing Euclidean convolution to graphs arise from the irregular structure of graphs. To bridge the gap between Euclidean space and graph topology, we propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution. The permutations constrain all nodes in a row regardless of their input order and therefore enable the flexible generalization of Euclidean convolution to graphs. Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning. CoCN follows local feature-learning and global parameter-sharing mechanisms of convolutional neural networks. The whole model can be trained end-to-end, with compressed convolution applied to learn individual node features and their corresponding structure features. CoCN can further borrow successful practices from Euclidean convolution, including residual connection and inception mechanism. We validate CoCN on both node-level and graph-level benchmarks. CoCN achieves superior performance over competitive GNN baselines. Codes are available at https://github.com/sunjss/CoCN.
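The core idea of the abstract, calibrating a graph with a differentiable permutation so a standard Euclidean convolution can slide over its nodes, can be illustrated with a minimal NumPy sketch. This is an assumption-laden illustration, not the paper's implementation: it uses a Sinkhorn-style doubly stochastic relaxation as the differentiable permutation (CoCN's exact parameterization may differ), and the function and variable names (`sinkhorn`, `calibrate_and_convolve`) are hypothetical.

```python
import numpy as np

def sinkhorn(logits, n_iters=20):
    """Alternately row- and column-normalize a score matrix in log space,
    projecting it toward the set of doubly stochastic matrices -- a
    differentiable relaxation of a hard permutation matrix."""
    log_p = logits
    for _ in range(n_iters):
        log_p = log_p - np.log(np.exp(log_p).sum(axis=1, keepdims=True))
        log_p = log_p - np.log(np.exp(log_p).sum(axis=0, keepdims=True))
    return np.exp(log_p)

def calibrate_and_convolve(x, logits, kernel):
    """Reorder node features with the soft permutation, then apply a plain
    1D Euclidean convolution over the calibrated node sequence."""
    p = sinkhorn(logits)              # (n, n) near-doubly-stochastic
    x_cal = p @ x                     # (n, d) calibrated node features
    k = len(kernel)
    return np.array([kernel @ x_cal[i:i + k]
                     for i in range(len(x_cal) - k + 1)])

rng = np.random.default_rng(0)
n, d = 6, 4
x = rng.normal(size=(n, d))           # toy node features
logits = rng.normal(size=(n, n))      # learnable permutation scores
kernel = np.array([0.25, 0.5, 0.25])  # toy 1D convolution kernel
y = calibrate_and_convolve(x, logits, kernel)
print(y.shape)  # (4, 4)
```

Because every step is differentiable, gradients could flow back through the convolution into the permutation scores, which is what lets the calibration be learned end-to-end.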
Related papers
- L^2GC: Lorentzian Linear Graph Convolutional Networks for Node Classification [12.69417276887153]
We propose a novel framework for Lorentzian linear GCN.
We map the learned features of graph nodes into hyperbolic space.
We then perform a Lorentzian linear feature transformation to capture the underlying tree-like structure of data.
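The "map into hyperbolic space" step above can be sketched with the standard exponential map at the origin of the Lorentz (hyperboloid) model. This is a generic textbook construction, not L^2GC's actual code, and the function name `lorentz_exp0` is hypothetical.

```python
import numpy as np

def lorentz_exp0(v):
    """Exponential map at the origin of the Lorentz model: lifts a
    Euclidean tangent vector v to a point on the hyperboloid, i.e. a
    vector x with Minkowski inner product <x, x> = -x0^2 + sum(xi^2) = -1."""
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.concatenate([[1.0], np.zeros_like(v)])
    return np.concatenate([[np.cosh(norm)], np.sinh(norm) * v / norm])

v = np.array([0.3, -0.4])             # toy Euclidean node feature
x = lorentz_exp0(v)
minkowski = -x[0] ** 2 + np.sum(x[1:] ** 2)
print(round(minkowski, 6))  # -1.0
```

The identity cosh^2(t) - sinh^2(t) = 1 guarantees the lifted point always lands exactly on the hyperboloid, which is why the Minkowski norm check returns -1.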
arXiv Detail & Related papers (2024-03-10T02:16:13Z) - T-GAE: Transferable Graph Autoencoder for Network Alignment [79.89704126746204]
T-GAE is a graph autoencoder framework that leverages transferability and stability of GNNs to achieve efficient network alignment without retraining.
Our experiments demonstrate that T-GAE outperforms the state-of-the-art optimization method and the best GNN approach by up to 38.7% and 50.8%, respectively.
arXiv Detail & Related papers (2023-10-05T02:58:29Z) - Learning Adaptive Neighborhoods for Graph Neural Networks [45.94778766867247]
Graph convolutional networks (GCNs) enable end-to-end learning on graph structured data.
We propose a novel end-to-end differentiable graph generator which builds graph topologies.
Our module can be readily integrated into existing pipelines involving graph convolution operations.
arXiv Detail & Related papers (2023-07-18T08:37:25Z) - NodeFormer: A Scalable Graph Structure Learning Transformer for Node
Classification [70.51126383984555]
We introduce a novel all-pair message passing scheme for efficiently propagating node signals between arbitrary nodes.
The efficient computation is enabled by a kernelized Gumbel-Softmax operator.
Experiments demonstrate the promising efficacy of the method in various tasks including node classification on graphs.
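The Gumbel-Softmax operator mentioned above is a standard differentiable relaxation of categorical sampling; a minimal NumPy sketch of it follows. This shows the generic relaxation only, not NodeFormer's kernelized variant, and the names (`gumbel_softmax`, the toy edge scores) are illustrative assumptions.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Add Gumbel(0, 1) noise to logits and apply a temperature-scaled
    softmax, yielding a soft, differentiable approximation of a one-hot
    categorical sample. Lower tau gives sharper (more discrete) samples."""
    rng = rng or np.random.default_rng()
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))  # Gumbel noise
    y = (logits + g) / tau
    y = y - y.max()                    # numerical stability
    e = np.exp(y)
    return e / e.sum()

rng = np.random.default_rng(42)
logits = np.array([2.0, 0.5, -1.0])   # toy edge scores for one node
w = gumbel_softmax(logits, tau=0.5, rng=rng)
print(round(w.sum(), 6))  # 1.0 -- a valid soft edge-weight distribution
```

Because the noise injection and softmax are both differentiable, gradients can pass through the sampled edge weights, which is what makes latent-structure learning trainable end-to-end.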
arXiv Detail & Related papers (2023-06-14T09:21:15Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Deformable Graph Convolutional Networks [12.857403315970231]
Graph neural networks (GNNs) have significantly improved representation power for graph-structured data.
In this paper, we propose Deformable Graph Convolutional Networks (Deformable GCNs) that adaptively perform convolution in multiple latent spaces.
Our framework simultaneously learns the node positional embeddings to determine the relations between nodes in an end-to-end fashion.
arXiv Detail & Related papers (2021-12-29T07:55:29Z) - ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z) - Action Recognition with Kernel-based Graph Convolutional Networks [14.924672048447338]
Learning graph convolutional networks (GCNs) aims at generalizing deep learning to arbitrary non-regular domains.
We introduce a novel GCN framework that achieves spatial graph convolution in a reproducing kernel Hilbert space (RKHS).
The particularity of our GCN model also resides in its ability to achieve convolutions without explicitly realigning nodes in the receptive fields of the learned graph filters with those of the input graphs.
arXiv Detail & Related papers (2020-12-28T11:02:51Z) - Permutation-equivariant and Proximity-aware Graph Neural Networks with Stochastic Message Passing [88.30867628592112]
Graph neural networks (GNNs) are emerging machine learning models on graphs.
Permutation-equivariance and proximity-awareness are two important properties highly desirable for GNNs.
We show that existing GNNs, mostly based on the message-passing mechanism, cannot simultaneously preserve the two properties.
In order to preserve node proximities, we augment the existing GNNs with node representations.
arXiv Detail & Related papers (2020-09-05T16:46:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.