Miniaturized Graph Convolutional Networks with Topologically Consistent
Pruning
- URL: http://arxiv.org/abs/2306.17590v1
- Date: Fri, 30 Jun 2023 12:09:22 GMT
- Title: Miniaturized Graph Convolutional Networks with Topologically Consistent
Pruning
- Authors: Hichem Sahbi
- Abstract summary: We devise a novel magnitude pruning method that allows extracting subnetworks while guaranteeing their topological consistency.
The latter ensures that only accessible and co-accessible connections are kept in the resulting lightweight networks.
Our solution is based on a novel reparametrization and two supervisory bi-directional networks which implement accessibility/co-accessibility.
- Score: 12.18340575383456
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Magnitude pruning is one of the mainstream methods in lightweight
architecture design whose goal is to extract subnetworks with the largest
weight connections. This method is known to be successful, but under very high
pruning regimes, it suffers from topological inconsistency which renders the
extracted subnetworks disconnected, and this hinders their generalization
ability. In this paper, we devise a novel magnitude pruning method that allows
extracting subnetworks while guaranteeing their topological consistency. The
latter ensures that only accessible and co-accessible -- impactful --
connections are kept in the resulting lightweight networks. Our solution is
based on a novel reparametrization and two supervisory bi-directional networks
which implement accessibility/co-accessibility and guarantee that only
connected subnetworks will be selected during training. This solution significantly
enhances generalization under very high pruning regimes, as
corroborated through extensive experiments, involving graph convolutional
networks, on the challenging task of skeleton-based action recognition.
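As a rough illustration of the idea only (not the paper's actual reparametrization or its two supervisory bi-directional networks), the sketch below applies a global magnitude threshold to a plain layered network and then keeps only connections whose source unit is accessible from the input layer and whose target unit is co-accessible to the output layer. The layer shapes, threshold rule, and function name are illustrative assumptions.

```python
# Minimal sketch (assumption: a plain layered MLP, weights[l] of shape (n_out, n_in)).
# This is NOT the paper's method; it only illustrates accessibility / co-accessibility.
import numpy as np

def topologically_consistent_prune(weights, keep_ratio=0.05):
    """Keep the largest-magnitude weights, then drop any surviving connection
    that does not lie on an input-to-output path."""
    # 1) Global magnitude threshold over all layers.
    all_mags = np.concatenate([np.abs(W).ravel() for W in weights])
    thresh = np.quantile(all_mags, 1.0 - keep_ratio)
    masks = [np.abs(W) >= thresh for W in weights]

    # 2) Accessibility: forward sweep. A unit is accessible if it receives at
    #    least one kept connection from an accessible unit (inputs are accessible).
    acc = np.ones(weights[0].shape[1], dtype=bool)
    accessible = [acc]
    for M in masks:
        acc = (M & acc[None, :]).any(axis=1)      # rows index target units
        accessible.append(acc)

    # 3) Co-accessibility: backward sweep. A unit is co-accessible if at least
    #    one kept connection leads from it to a co-accessible unit (outputs qualify).
    coacc = np.ones(weights[-1].shape[0], dtype=bool)
    co_accessible = [coacc]
    for M in reversed(masks):
        coacc = (M & coacc[:, None]).any(axis=0)  # columns index source units
        co_accessible.append(coacc)
    co_accessible = co_accessible[::-1]

    # 4) A connection is kept only if its source is accessible and its target co-accessible.
    pruned = []
    for l, (W, M) in enumerate(zip(weights, masks)):
        keep = M & accessible[l][None, :] & co_accessible[l + 1][:, None]
        pruned.append(W * keep)
    return pruned

# Toy usage: three dense layers, keeping roughly 5% of the connections.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 32)),
          rng.standard_normal((64, 64)),
          rng.standard_normal((10, 64))]
pruned = topologically_consistent_prune(layers, keep_ratio=0.05)
```

In the paper, this constraint is enforced during training rather than as a post-hoc clean-up; the sketch only shows why plain magnitude thresholding can leave dangling, non-impactful connections at very high pruning rates.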
Related papers
- A Multi-objective Complex Network Pruning Framework Based on
Divide-and-conquer and Global Performance Impairment Ranking [40.59001171151929]
A multi-objective complex network pruning framework based on divide-and-conquer and global performance impairment ranking is proposed in this paper.
The proposed algorithm achieves a comparable performance with the state-of-the-art pruning methods.
arXiv Detail & Related papers (2023-03-28T12:05:15Z) - Lightweight Graph Convolutional Networks with Topologically Consistent
Magnitude Pruning [12.18340575383456]
Graph convolutional networks (GCNs) are currently mainstream in learning with irregular data.
In this paper, we devise a novel method for lightweight GCN design.
Our proposed approach parses and selects subnetworks with the highest magnitudes while guaranteeing their topological consistency.
arXiv Detail & Related papers (2022-03-25T12:34:11Z) - Cascaded Compressed Sensing Networks: A Reversible Architecture for
Layerwise Learning [11.721183551822097]
We show that target propagation can be achieved by modeling each layer of the network with compressed sensing, without the need for auxiliary networks.
Experiments show that the proposed method could achieve better performance than the auxiliary network-based method.
arXiv Detail & Related papers (2021-10-20T05:21:13Z) - Unsupervised Domain-adaptive Hash for Networks [81.49184987430333]
Domain-adaptive hash learning has enjoyed considerable success in the computer vision community.
We develop an unsupervised domain-adaptive hash learning method for networks, dubbed UDAH.
arXiv Detail & Related papers (2021-08-20T12:09:38Z) - Manifold Regularized Dynamic Network Pruning [102.24146031250034]
This paper proposes a new paradigm that dynamically removes redundant filters by embedding the manifold information of all instances into the space of pruned networks.
The effectiveness of the proposed method is verified on several benchmarks, which shows better performance in terms of both accuracy and computational cost.
arXiv Detail & Related papers (2021-03-10T03:59:03Z) - Anomaly Detection on Attributed Networks via Contrastive Self-Supervised
Learning [50.24174211654775]
We present a novel contrastive self-supervised learning framework for anomaly detection on attributed networks.
Our framework fully exploits the local information from network data by sampling a novel type of contrastive instance pair.
A graph neural network-based contrastive learning model is proposed to learn informative embedding from high-dimensional attributes and local structure.
arXiv Detail & Related papers (2021-02-27T03:17:20Z) - Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective to represent a network into a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and owns adaptability to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z) - ESPN: Extremely Sparse Pruned Networks [50.436905934791035]
We show that a simple iterative mask discovery method can achieve state-of-the-art compression of very deep networks.
Our algorithm represents a hybrid approach between single-shot network pruning methods and Lottery-Ticket-style approaches (a minimal illustrative sketch of this style of iterative magnitude masking is given after this list).
arXiv Detail & Related papers (2020-06-28T23:09:27Z) - Deep hierarchical pooling design for cross-granularity action
recognition [14.696233190562936]
We introduce a novel hierarchical aggregation design that captures different levels of temporal granularity in action recognition.
Learning the combination of operations in this network -- which best fits a given ground-truth -- is obtained by solving a constrained minimization problem.
Besides being principled and well grounded, the proposed hierarchical pooling is also video-length agnostic and resilient to misalignments in actions.
arXiv Detail & Related papers (2020-06-08T11:03:54Z) - Compact Neural Representation Using Attentive Network Pruning [1.0152838128195465]
We describe a Top-Down attention mechanism that is added to a Bottom-Up feedforward network to select important connections and subsequently prune redundant ones at all parametric layers.
Our method not only introduces a novel hierarchical selection mechanism as the basis of pruning but also remains competitive with previous baseline methods in the experimental evaluation.
arXiv Detail & Related papers (2020-05-10T03:20:01Z) - Fitting the Search Space of Weight-sharing NAS with Graph Convolutional
Networks [100.14670789581811]
We train a graph convolutional network to fit the performance of sampled sub-networks.
With this strategy, we achieve a higher rank correlation coefficient in the selected set of candidates.
arXiv Detail & Related papers (2020-04-17T19:12:39Z)