Effectiveness of the Recent Advances in Capsule Networks
- URL: http://arxiv.org/abs/2210.05834v1
- Date: Tue, 11 Oct 2022 23:30:12 GMT
- Title: Effectiveness of the Recent Advances in Capsule Networks
- Authors: Nidhin Harilal, Rohan Patil
- Abstract summary: Convolutional neural networks (CNNs) have revolutionized the field of deep neural networks.
Recent research has shown that CNNs fail to generalize under various conditions.
The idea of capsules was introduced in 2011, though the real surge of research started in 2017.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Convolutional neural networks (CNNs) have revolutionized the field of deep
neural networks. However, recent research has shown that CNNs fail to
generalize under various conditions, which motivated the idea of capsules,
introduced in 2011; the real surge of research started in 2017. In this
paper, we present an overview of the recent advances in capsule
architectures and routing mechanisms. In addition, we find that the relative
focus in recent literature is on modifying the routing procedure or the
architecture as a whole, while the study of finer components, specifically
the squash function, is wanting. Thus, we also present new insights
regarding the effect of squash functions on the performance of capsule
networks. Finally, we conclude by discussing and proposing possible
opportunities in the field of capsule networks.
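
Since the paper's new experiments center on the squash non-linearity, a concrete reference point helps. Below is a minimal NumPy sketch of the original squash function from Sabour et al. (2017), together with one illustrative alternative; the exponential variant is included only as an example of the kind of substitution such a study might compare, not as a function evaluated in this paper.

    import numpy as np

    def squash(s, axis=-1, eps=1e-8):
        # Original squash (Sabour et al., 2017):
        # v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
        norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
        return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

    def squash_exp(s, axis=-1, eps=1e-8):
        # Illustrative alternative (an assumption for comparison, not
        # necessarily a variant studied in this paper):
        # v = (1 - exp(-||s||)) * (s / ||s||)
        norm = np.sqrt(np.sum(s ** 2, axis=axis, keepdims=True) + eps)
        return (1.0 - np.exp(-norm)) * s / norm

    # Both keep the capsule's direction and bound its length to (0, 1).
    u = np.array([[3.0, 4.0]])            # ||u|| = 5
    print(np.linalg.norm(squash(u)))      # ~0.96
    print(np.linalg.norm(squash_exp(u)))  # ~0.99

Both functions map a capsule's raw vector s to an output with the same direction and a length in (0, 1), so the length can be read as the probability that the entity the capsule represents is present; they differ in how quickly that length saturates.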
Related papers
- Hierarchical Object-Centric Learning with Capsule Networks [0.0]
Capsule networks (CapsNets) were introduced to address the limitations of convolutional neural networks.
This thesis investigates the intriguing aspects of CapsNets and focuses on three key questions to unlock their full potential.
arXiv Detail & Related papers (2024-05-30T09:10:33Z)
- Vanishing Activations: A Symptom of Deep Capsule Networks [10.046549855562123]
Capsule Networks are an extension of neural networks that use vector or matrix representations instead of scalars.
Early implementations of Capsule Networks achieved and maintained state-of-the-art results on various datasets.
Recent studies have revealed shortcomings in the original Capsule Network architecture.
arXiv Detail & Related papers (2023-05-13T15:42:26Z)
- Learning with Capsules: A Survey [73.31150426300198]
Capsule networks were proposed as an alternative approach to Convolutional Neural Networks (CNNs) for learning object-centric representations.
Unlike CNNs, capsule networks are designed to explicitly model part-whole hierarchical relationships; a minimal routing-by-agreement sketch follows this list.
arXiv Detail & Related papers (2022-06-06T15:05:36Z)
- SAR Despeckling Using Overcomplete Convolutional Networks [53.99620005035804]
Despeckling is an important problem in remote sensing, as speckle degrades SAR images.
Recent studies show that convolutional neural networks (CNNs) outperform classical despeckling methods.
This study employs an overcomplete CNN architecture to focus on learning low-level features by restricting the receptive field.
We show that the proposed network improves despeckling performance compared to recent despeckling methods on synthetic and real SAR images.
arXiv Detail & Related papers (2022-05-31T15:55:37Z)
- Emerging Paradigms of Neural Network Pruning [82.9322109208353]
Pruning is adopted as a post-processing solution to the over-parameterization of neural networks, aiming to remove unnecessary parameters with little performance compromise.
Recent works challenge this belief by discovering random sparse networks which can be trained to match the performance of their dense counterparts.
This survey seeks to bridge the gap by proposing a general pruning framework so that the emerging pruning paradigms can be accommodated well with the traditional one.
arXiv Detail & Related papers (2021-03-11T05:01:52Z)
- Neural Networks Enhancement with Logical Knowledge [83.9217787335878]
We propose an extension of KENN for relational data.
The results show that KENN is capable of increasing the performance of the underlying neural network even in the presence of relational data.
arXiv Detail & Related papers (2020-09-13T21:12:20Z)
- Wasserstein Routed Capsule Networks [90.16542156512405]
We propose a new parameter-efficient capsule architecture that is able to tackle complex tasks.
We show that our network substantially outperforms other capsule approaches, by over 1.2% on CIFAR-10.
arXiv Detail & Related papers (2020-07-22T14:38:05Z)
- Expressivity of Deep Neural Networks [2.7909470193274593]
In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks.
While the main body of existing results concerns general feedforward architectures, we also describe approximation results for convolutional, residual, and recurrent neural networks.
arXiv Detail & Related papers (2020-07-09T13:08:01Z)
- Subspace Capsule Network [85.69796543499021]
The Subspace Capsule Network (SCN) exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity.
SCN can be applied to both discriminative and generative models without incurring computational overhead compared to CNNs at test time.
arXiv Detail & Related papers (2020-02-07T17:51:56Z)
- Examining the Benefits of Capsule Neural Networks [9.658250977094562]
Capsule networks are a newly developed class of neural networks that potentially address some of the deficiencies of traditional convolutional neural networks.
By replacing the standard scalar activations with vectors, capsule networks aim to be the next great development for computer vision applications.
arXiv Detail & Related papers (2020-01-29T17:18:43Z)
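
As referenced above, the routing mechanisms surveyed here are variants of routing-by-agreement. The following is a minimal NumPy sketch of the canonical dynamic routing procedure from Sabour et al. (2017); it is a reference implementation of the baseline that the surveyed papers modify, not code from any of them, and it assumes the prediction vectors u_hat have already been produced by learned transformation matrices.

    import numpy as np

    def squash(s, axis=-1, eps=1e-8):
        # v = (||s||^2 / (1 + ||s||^2)) * (s / ||s||)
        norm_sq = np.sum(s ** 2, axis=axis, keepdims=True)
        return (norm_sq / (1.0 + norm_sq)) * s / np.sqrt(norm_sq + eps)

    def dynamic_routing(u_hat, num_iters=3):
        # u_hat[i, j] is the prediction of lower capsule i for upper capsule j,
        # so u_hat has shape (num_lower, num_upper, dim_upper).
        num_lower, num_upper, _ = u_hat.shape
        b = np.zeros((num_lower, num_upper))  # routing logits
        for _ in range(num_iters):
            # Softmax over upper capsules: each lower capsule splits its vote.
            c = np.exp(b - b.max(axis=1, keepdims=True))
            c = c / c.sum(axis=1, keepdims=True)
            s = np.einsum('ij,ijd->jd', c, u_hat)       # weighted sum of predictions
            v = squash(s)                               # upper-capsule outputs
            b = b + np.einsum('ijd,jd->ij', u_hat, v)   # reward agreement
        return v

    # Example: route 6 lower capsules (8-D predictions) to 3 upper capsules.
    rng = np.random.default_rng(0)
    u_hat = rng.normal(size=(6, 3, 8))
    print(dynamic_routing(u_hat).shape)  # (3, 8)

Each lower-level capsule casts a prediction for every upper-level capsule; over the iterations, the coupling coefficients shift toward the upper capsules whose outputs agree with those predictions, which is the part-whole assignment mechanism that much of the literature above revisits.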