Lightweight Graph Convolutional Networks with Topologically Consistent
Magnitude Pruning
- URL: http://arxiv.org/abs/2203.13616v1
- Date: Fri, 25 Mar 2022 12:34:11 GMT
- Title: Lightweight Graph Convolutional Networks with Topologically Consistent
Magnitude Pruning
- Authors: Hichem Sahbi
- Abstract summary: Graph convolution networks (GCNs) are currently mainstream in learning with irregular data.
In this paper, we devise a novel method for lightweight GCN design.
Our proposed approach parses and selects subnetworks with the highest magnitudes while guaranteeing their topological consistency.
- Score: 12.18340575383456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph convolution networks (GCNs) are currently mainstream in learning with
irregular data. These models rely on message passing and attention mechanisms
that capture context and node-to-node relationships. With multi-head attention,
GCNs become highly accurate but oversized, and their deployment on cheap
devices requires their pruning. However, pruning at high regimes usually leads
to topologically inconsistent networks with weak generalization. In this paper,
we devise a novel method for lightweight GCN design. Our proposed approach
parses and selects subnetworks with the highest magnitudes while guaranteeing
their topological consistency. The latter is obtained by selecting only
accessible and co-accessible connections which actually contribute in the
evaluation of the selected subnetworks. Experiments conducted on the
challenging FPHA dataset show the substantial gain of our topologically
consistent pruning method especially at very high pruning regimes.
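The core idea in the abstract — magnitude pruning followed by keeping only accessible and co-accessible connections — can be illustrated with a minimal NumPy sketch. The function name, the layered weight-matrix setting, and the global quantile threshold are illustrative assumptions; the paper applies the idea to GCN weight tensors, not to this toy feedforward stack.

```python
import numpy as np

def topologically_consistent_prune(weights, keep_ratio):
    """Magnitude-prune a stack of layer weight matrices, then drop any
    surviving connection that is not both accessible (reachable from the
    input) and co-accessible (able to reach the output)."""
    # 1. Global magnitude pruning: keep the largest-magnitude weights.
    all_mags = np.concatenate([np.abs(w).ravel() for w in weights])
    threshold = np.quantile(all_mags, 1.0 - keep_ratio)
    masks = [np.abs(w) >= threshold for w in weights]

    # 2. Forward pass: a unit is accessible if some surviving connection
    #    reaches it from an accessible unit.
    accessible = np.ones(weights[0].shape[0], dtype=bool)  # input units
    fwd = [accessible]
    for m in masks:
        accessible = (m & accessible[:, None]).any(axis=0)
        fwd.append(accessible)

    # 3. Backward pass: a unit is co-accessible if it can still reach
    #    the output through surviving connections.
    coacc = np.ones(weights[-1].shape[1], dtype=bool)  # output units
    bwd = [coacc]
    for m in reversed(masks):
        coacc = (m & coacc[None, :]).any(axis=1)
        bwd.append(coacc)
    bwd = bwd[::-1]

    # 4. Keep a connection only if its source is accessible and its
    #    target is co-accessible; everything else is dead weight.
    pruned = []
    for w, m, src_ok, dst_ok in zip(weights, masks, fwd[:-1], bwd[1:]):
        pruned.append(w * (m & src_ok[:, None] & dst_ok[None, :]))
    return pruned
```

In this sketch, a connection can survive the magnitude threshold yet still be removed because its target unit has no surviving path to the output — exactly the kind of topologically inconsistent residue that plain magnitude pruning leaves behind at high pruning regimes.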
Related papers
- Applying Self-supervised Learning to Network Intrusion Detection for
Network Flows with Graph Neural Network [8.318363497010969]
This paper studies the application of GNNs to identify the specific types of network flows in an unsupervised manner.
To the best of our knowledge, it is the first GNN-based self-supervised method for the multiclass classification of network flows in NIDS.
arXiv Detail & Related papers (2024-03-03T12:34:13Z) - Miniaturized Graph Convolutional Networks with Topologically Consistent
Pruning [12.18340575383456]
We devise a novel magnitude pruning method that allows extracting subnetworks while guaranteeing their topological consistency.
The latter ensures that only accessible and co-accessible connections are kept in the resulting lightweight networks.
Our solution is based on a novel reparametrization and two supervisory bi-directional networks which implement accessibility/co-accessibility.
arXiv Detail & Related papers (2023-06-30T12:09:22Z) - Budget-Aware Graph Convolutional Network Design using Probabilistic
Magnitude Pruning [12.18340575383456]
We devise a novel lightweight graph convolutional network (GCN) design dubbed Probabilistic Magnitude Pruning (PMP).
Our method is variational and proceeds by aligning the weight distribution of the learned networks with an a priori distribution.
Experiments conducted on the challenging task of skeleton-based recognition show a substantial gain of our lightweight GCNs.
arXiv Detail & Related papers (2023-05-30T18:12:13Z) - Revisiting Heterophily For Graph Neural Networks [42.41238892727136]
Graph Neural Networks (GNNs) extend basic Neural Networks (NNs) by using graph structures based on the relational inductive bias (homophily assumption).
Recent work has identified a non-trivial set of datasets where their performance compared to NNs is not satisfactory.
arXiv Detail & Related papers (2022-10-14T08:00:26Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN)
arXiv Detail & Related papers (2022-07-06T10:01:46Z) - Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z) - Dimensionality Reduction Meets Message Passing for Graph Node Embeddings [0.0]
We propose PCAPass, a method which combines Principal Component Analysis (PCA) and message passing for generating node embeddings in an unsupervised manner.
We show empirically that this approach provides competitive performance compared to popular GNNs on node classification benchmarks.
Our research demonstrates that applying dimensionality reduction with message passing and skip connections is a promising mechanism for aggregating long-range dependencies in graph structured data.
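The PCAPass recipe described above — message passing for several hops, skip connections that retain each hop's features, and dimensionality reduction — can be sketched in a few lines. The function name, the mean-aggregation operator, and the defaults below are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def pcapass_embed(adj, features, hops=2, out_dim=8):
    """Unsupervised node embeddings in the spirit of PCAPass: aggregate
    neighbor features for several hops, concatenate all hops (skip
    connections), then compress the result with PCA."""
    # Row-normalized adjacency acts as a mean-aggregation operator.
    deg = adj.sum(axis=1, keepdims=True)
    agg = adj / np.maximum(deg, 1)

    h, collected = features, [features]   # hop-0 kept via skip connection
    for _ in range(hops):
        h = agg @ h                       # one round of message passing
        collected.append(h)
    stacked = np.concatenate(collected, axis=1)

    # PCA via SVD of the centered, concatenated feature matrix.
    centered = stacked - stacked.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    k = min(out_dim, vt.shape[0])
    return centered @ vt[:k].T
```

Because every hop's representation is kept before the PCA step, long-range information is aggregated without the embedding dimension growing with the number of hops.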
arXiv Detail & Related papers (2022-02-01T13:39:00Z) - AM-GCN: Adaptive Multi-channel Graph Convolutional Networks [85.0332394224503]
We study whether Graph Convolutional Networks (GCNs) can optimally integrate node features and topological structures in a complex graph with rich information.
We propose an adaptive multi-channel graph convolutional networks for semi-supervised classification (AM-GCN)
Our experiments show that AM-GCN extracts the most correlated information from both node features and topological structures.
arXiv Detail & Related papers (2020-07-05T08:16:03Z) - Graph Prototypical Networks for Few-shot Learning on Attributed Networks [72.31180045017835]
We propose a graph meta-learning framework -- Graph Prototypical Networks (GPN)
GPN is able to perform meta-learning on an attributed network and derive a highly generalizable model for handling the target classification task.
arXiv Detail & Related papers (2020-06-23T04:13:23Z) - Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
However, they suffer from the notorious over-smoothing problem, in which the learned representations converge to similar vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
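The gating idea in the Graph Highway Networks entry — balancing the aggregated (smoothed) signal against each node's own features — can be sketched as a single layer. The function and parameter names below are illustrative assumptions, not GHNet's exact architecture.

```python
import numpy as np

def graph_highway_layer(adj, h, w, w_gate):
    """One gated graph layer in the spirit of Graph Highway Networks: a
    learned sigmoid gate mixes the neighborhood-aggregated signal with the
    node's own transformed features, limiting over-smoothing."""
    deg = adj.sum(axis=1, keepdims=True)
    h_agg = (adj / np.maximum(deg, 1)) @ h @ w    # GCN-style propagation
    gate = 1.0 / (1.0 + np.exp(-(h @ w_gate)))    # per-unit sigmoid gate
    return gate * h_agg + (1.0 - gate) * (h @ w)  # highway-style mixture
```

When the gate saturates toward 0, a node keeps its own features (heterogeneity); toward 1, it adopts the smoothed neighborhood signal (homogeneity), so stacking many layers need not collapse all representations.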
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.