Mitigating Vanishing Activations in Deep CapsNets Using Channel Pruning
- URL: http://arxiv.org/abs/2410.16908v1
- Date: Tue, 22 Oct 2024 11:28:39 GMT
- Title: Mitigating Vanishing Activations in Deep CapsNets Using Channel Pruning
- Authors: Siddharth Sahu, Abdulrahman Altahhan
- Abstract summary: Capsule Networks outperform Convolutional Neural Networks in learning part-whole relationships with viewpoint invariance.
It was assumed that increasing the number of capsule layers would enhance model performance.
Recent studies found that Capsule Networks lack scalability due to vanishing activations in the capsules of deeper layers.
- Score: 0.0
- License:
- Abstract: Capsule Networks outperform Convolutional Neural Networks in learning part-whole relationships with viewpoint invariance, a capability credited to their multidimensional capsules. It was assumed that increasing the number of capsule layers would enhance model performance. However, recent studies found that Capsule Networks lack scalability due to vanishing activations in the capsules of deeper layers. This paper thoroughly investigates the vanishing activation problem in deep Capsule Networks. To analyze this issue and understand how increasing capsule dimensions can facilitate deeper networks, this paper constructs and evaluates various Capsule Network models with different numbers of capsules, capsule dimensions, and intermediate layers. Unlike traditional model pruning, which reduces the number of model parameters and expedites training, this study uses pruning to mitigate the vanishing activations in the deeper capsule layers. In addition, the backbone network and capsule layers are pruned with different pruning ratios to reduce the number of inactive capsules and achieve better accuracy than the unpruned models.
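The abstract does not spell out the exact pruning criterion, so the following is only a minimal sketch of what structured (channel) pruning of a CapsNet-style convolutional backbone could look like, using PyTorch's built-in pruning utilities with a different ratio per layer. The toy backbone, layer shapes, and ratios are illustrative assumptions rather than the paper's actual configuration.

```python
# Hypothetical sketch: L1-norm structured (channel) pruning on a toy
# CapsNet-style convolutional backbone, with different pruning ratios per
# layer. Shapes and ratios are illustrative, not the paper's configuration.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Toy backbone: a feature-extractor conv followed by a primary-capsule conv.
backbone = nn.Sequential(
    nn.Conv2d(1, 256, kernel_size=9),               # feature extractor
    nn.ReLU(inplace=True),
    nn.Conv2d(256, 256, kernel_size=9, stride=2),   # primary-capsule conv
)

# Zero out entire output channels ranked by L1 norm (n=1, dim=0),
# using different ratios for the two layers.
prune.ln_structured(backbone[0], name="weight", amount=0.3, n=1, dim=0)
prune.ln_structured(backbone[2], name="weight", amount=0.5, n=1, dim=0)

# Fold the pruning masks into the weights permanently.
for m in (backbone[0], backbone[2]):
    prune.remove(m, "weight")

# Quick check: fraction of output channels whose weights are now all zero.
for i in (0, 2):
    w = backbone[i].weight.detach()
    zeroed = (w.abs().flatten(1).sum(dim=1) == 0).float().mean().item()
    print(f"layer {i}: {zeroed:.0%} channels pruned")
```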
Related papers
- Mamba Capsule Routing Towards Part-Whole Relational Camouflaged Object Detection [98.6460229237143]
We propose a novel mamba capsule routing at the type level.
These type-level mamba capsules are fed into the EM routing algorithm to get the high-layer mamba capsules.
On top of that, pixel-level capsule features for further camouflaged prediction are retrieved on the basis of the low-layer pixel-level capsules.
arXiv Detail & Related papers (2024-10-05T00:20:22Z)
- Hierarchical Object-Centric Learning with Capsule Networks [0.0]
Capsule networks (CapsNets) were introduced to address the limitations of convolutional neural networks.
This thesis investigates the intriguing aspects of CapsNets and focuses on three key questions to unlock their full potential.
arXiv Detail & Related papers (2024-05-30T09:10:33Z)
- Deep multi-prototype capsule networks [0.3823356975862005]
Capsule networks are a type of neural network that identifies image parts and hierarchically forms the instantiation parameters of a whole.
This paper presents a multi-prototype architecture for guiding capsule networks to represent the variations in the image parts.
The experimental results on MNIST, SVHN, C-Cube, CEDAR, MCYT, and UTSig datasets reveal that the proposed model outperforms others regarding image classification accuracy.
arXiv Detail & Related papers (2024-04-23T18:37:37Z)
- Vanishing Activations: A Symptom of Deep Capsule Networks [10.046549855562123]
Capsule Networks are an extension of Neural Networks that use vector or matrix representations instead of scalars.
Early implementations of Capsule Networks achieved, and still maintain, state-of-the-art results on various datasets.
Recent studies have revealed shortcomings in the original Capsule Network architecture.
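To make the vanishing-activation symptom concrete, below is a minimal, hypothetical sketch that reports the mean capsule activation (the vector length produced by the squash nonlinearity of Sabour et al.) at a few layers; progressively smaller mean lengths in deeper layers would indicate vanishing activations. The random tensors are toy stand-ins, not outputs of any of the cited models.

```python
# Hypothetical sketch: capsule activations are vector lengths after the
# "squash" nonlinearity. Tracking the mean length per layer is one simple
# way to observe vanishing activations in deeper capsule layers.
import torch

def squash(s: torch.Tensor, dim: int = -1, eps: float = 1e-8) -> torch.Tensor:
    """Squash capsule vectors so their length lies in [0, 1)."""
    sq_norm = (s ** 2).sum(dim=dim, keepdim=True)
    return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + eps)

# Toy stand-ins for three successive capsule layers: (batch, capsules, dim).
layer_outputs = [torch.randn(32, 16, 8) * scale for scale in (1.0, 0.3, 0.05)]

for i, s in enumerate(layer_outputs, start=1):
    v = squash(s)
    mean_len = v.norm(dim=-1).mean().item()
    print(f"capsule layer {i}: mean activation length = {mean_len:.3f}")
```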
arXiv Detail & Related papers (2023-05-13T15:42:26Z)
- Towards Efficient Capsule Networks [7.1577508803778045]
Capsule Networks were introduced to enhance the explainability of a model, where each capsule explicitly represents an object or one of its parts.
We show how pruning with Capsule Networks achieves high generalization with lower memory requirements, less computational effort, and shorter inference and training times.
arXiv Detail & Related papers (2022-08-19T08:03:25Z)
- Learning with Capsules: A Survey [73.31150426300198]
Capsule networks were proposed as an alternative approach to Convolutional Neural Networks (CNNs) for learning object-centric representations.
Unlike CNNs, capsule networks are designed to explicitly model part-whole hierarchical relationships.
arXiv Detail & Related papers (2022-06-06T15:05:36Z)
- HP-Capsule: Unsupervised Face Part Discovery by Hierarchical Parsing Capsule Network [76.92310948325847]
We propose a Hierarchical Parsing Capsule Network (HP-Capsule) for unsupervised face subpart-part discovery.
HP-Capsule extends the application of capsule networks from digits to human faces and takes a step forward to show how the neural networks understand objects without human intervention.
arXiv Detail & Related papers (2022-03-21T01:39:41Z)
- Routing with Self-Attention for Multimodal Capsule Networks [108.85007719132618]
We present a new multimodal capsule network that allows us to leverage the strength of capsules in the context of a multimodal learning framework.
To adapt the capsules to large-scale input data, we propose a novel routing by self-attention mechanism that selects relevant capsules.
This allows not only for robust training with noisy video data, but also for scaling up the size of the capsule network compared to traditional routing methods.
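As a rough illustration of routing by attention (not necessarily the cited paper's exact mechanism), the sketch below aggregates lower-layer capsules into higher-layer capsules with scaled dot-product attention, where the softmax over lower capsules plays the role of routing coefficients. All shapes and the learnable query vectors are assumptions made for illustration.

```python
# Hypothetical sketch of attention-style routing between capsule layers:
# lower-level capsules are aggregated into higher-level capsules via scaled
# dot-product attention instead of iterative routing-by-agreement.
import torch
import torch.nn.functional as F

batch, n_low, n_high, d = 4, 32, 10, 16
low_caps = torch.randn(batch, n_low, d)   # lower-layer capsules
queries = torch.randn(n_high, d)          # one learnable query per output capsule

# Attention scores: how strongly each lower capsule routes to each higher one.
scores = torch.einsum("bld,hd->blh", low_caps, queries) / d ** 0.5
routing = F.softmax(scores, dim=1)        # normalize over the lower capsules

# Higher-level capsules as attention-weighted sums of lower-level capsules.
high_caps = torch.einsum("blh,bld->bhd", routing, low_caps)
print(high_caps.shape)                    # torch.Size([4, 10, 16])
```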
arXiv Detail & Related papers (2021-12-01T19:01:26Z)
- Training Deep Capsule Networks with Residual Connections [0.0]
Capsule networks are a type of neural network that has recently gained increased popularity.
They consist of groups of neurons, called capsules, which encode properties of objects or object parts.
Most capsule network implementations use two to three capsule layers, which limits their applicability, since expressivity grows exponentially with depth.
We propose a methodology to train deeper capsule networks using residual connections, evaluated on four datasets and with three different routing algorithms.
Our experimental results show that, in fact, performance increases when training deeper capsule networks.
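For illustration only, here is a minimal sketch of a residual (skip) connection wrapped around a capsule layer, the general idea the entry above describes for training deeper capsule networks; the shared linear capsule transform and tensor shapes are simplifying assumptions, not the cited paper's architecture.

```python
# Hypothetical sketch: a residual connection around a capsule layer, so the
# transformed capsules are added to the incoming capsules before squashing.
import torch
import torch.nn as nn

class ResidualCapsuleLayer(nn.Module):
    def __init__(self, caps_dim: int):
        super().__init__()
        self.transform = nn.Linear(caps_dim, caps_dim)  # per-capsule transform

    def forward(self, u: torch.Tensor) -> torch.Tensor:
        # u: (batch, num_capsules, caps_dim)
        s = self.transform(u) + u                       # residual connection
        sq_norm = (s ** 2).sum(dim=-1, keepdim=True)
        return (sq_norm / (1.0 + sq_norm)) * s / torch.sqrt(sq_norm + 1e-8)

caps = torch.randn(4, 16, 8)                            # toy capsule tensor
print(ResidualCapsuleLayer(8)(caps).shape)              # torch.Size([4, 16, 8])
```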
arXiv Detail & Related papers (2021-04-15T11:42:44Z)
- Wasserstein Routed Capsule Networks [90.16542156512405]
We propose a new parameter-efficient capsule architecture that is able to tackle complex tasks.
We show that our network substantially outperforms other capsule approaches by over 1.2% on CIFAR-10.
arXiv Detail & Related papers (2020-07-22T14:38:05Z)
- Subspace Capsule Network [85.69796543499021]
SubSpace Capsule Network (SCN) exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity.
SCN can be applied to both discriminative and generative models without incurring computational overhead compared to CNN during test time.
arXiv Detail & Related papers (2020-02-07T17:51:56Z)