Examining the Benefits of Capsule Neural Networks
- URL: http://arxiv.org/abs/2001.10964v1
- Date: Wed, 29 Jan 2020 17:18:43 GMT
- Title: Examining the Benefits of Capsule Neural Networks
- Authors: Arjun Punjabi, Jonas Schmid, Aggelos K. Katsaggelos
- Abstract summary: Capsule networks are a newly developed class of neural networks that potentially address some of the deficiencies of traditional convolutional neural networks.
By replacing the standard scalar activations with vectors, capsule networks aim to be the next great development for computer vision applications.
- Score: 9.658250977094562
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Capsule networks are a recently developed class of neural networks that
potentially address some of the deficiencies of traditional convolutional
neural networks. By replacing the standard scalar activations with vectors, and
by connecting the artificial neurons in a new way, capsule networks aim to be
the next great development for computer vision applications. However, in order
to determine whether these networks truly operate differently than traditional
networks, one must look at the differences in the capsule features. To this
end, we perform several analyses with the purpose of elucidating capsule
features and determining whether they perform as described in the initial
publication. First, we perform a deep visualization analysis to visually
compare capsule features and convolutional neural network features. Then, we
look at the ability of capsule features to encode information across the
vector components and determine which changes in the capsule architecture provide
the most benefit. Finally, we look at how well the capsule features are able to
encode instantiation parameters of class objects via visual transformations.
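
The two mechanisms the abstract names are concrete enough to sketch: vector-valued activations in place of scalars, and a new routing scheme for connecting neurons. Below is a minimal NumPy sketch, assuming the squash nonlinearity and routing-by-agreement from the original dynamic-routing CapsNet formulation (Sabour et al., 2017); the shapes, variable names, and random inputs are illustrative only and are not drawn from this paper's experiments.

```python
import numpy as np

def squash(s, axis=-1, eps=1e-8):
    # CapsNet nonlinearity: shrinks short vectors toward zero and long ones
    # toward unit length, so a capsule's length reads as an existence probability.
    sq_norm = np.sum(s ** 2, axis=axis, keepdims=True)
    return (sq_norm / (1.0 + sq_norm)) * s / np.sqrt(sq_norm + eps)

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def dynamic_routing(u_hat, num_iters=3):
    # u_hat: (num_in, num_out, dim) prediction vectors from lower-level capsules.
    # Routing-by-agreement is the "new way of connecting neurons": couplings are
    # iteratively re-weighted toward the outputs the predictions agree on.
    b = np.zeros(u_hat.shape[:2])                  # routing logits
    for _ in range(num_iters):
        c = softmax(b, axis=1)                     # coupling coefficients
        s = (c[..., None] * u_hat).sum(axis=0)     # weighted sum per output capsule
        v = squash(s)                              # vector-valued activations
        b = b + (u_hat * v[None]).sum(axis=-1)     # reward agreement (dot product)
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(8, 3, 4))   # 8 input capsules, 3 output capsules, dim 4
v = dynamic_routing(u_hat)
print(np.linalg.norm(v, axis=-1))    # lengths in (0, 1): one "existence" score each
```

The length of each output vector plays the role a scalar activation would in a CNN, while the remaining degrees of freedom can encode instantiation parameters such as pose, which is what the abstract's final analysis probes.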
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- Hierarchical Object-Centric Learning with Capsule Networks [0.0]
Capsule networks (CapsNets) were introduced to address the limitations of convolutional neural networks.
This thesis investigates the intriguing aspects of CapsNets and focuses on three key questions to unlock their full potential.
arXiv Detail & Related papers (2024-05-30T09:10:33Z)
- ShadowNet for Data-Centric Quantum System Learning [188.683909185536]
We propose a data-centric learning paradigm combining the strength of neural-network protocols and classical shadows.
Capitalizing on the generalization power of neural networks, this paradigm can be trained offline and excel at predicting previously unseen systems.
We present the instantiation of our paradigm in quantum state tomography and direct fidelity estimation tasks and conduct numerical analysis up to 60 qubits.
arXiv Detail & Related papers (2023-08-22T09:11:53Z)
- Why Capsule Neural Networks Do Not Scale: Challenging the Dynamic Parse-Tree Assumption [16.223322939363033]
Capsule neural networks replace simple, scalar-valued neurons with vector-valued capsules.
CapsNet is the first actual implementation of the conceptual idea of capsule neural networks.
No work has been able to scale the CapsNet architecture to reasonably sized datasets.
arXiv Detail & Related papers (2023-01-04T12:59:51Z)
- Effectiveness of the Recent Advances in Capsule Networks [0.0]
Convolutional neural networks (CNNs) have revolutionized the field of deep neural networks.
Recent research has shown that CNNs fail to generalize under various conditions.
The idea of capsules was introduced in 2011, though the real surge of research began in 2017.
arXiv Detail & Related papers (2022-10-11T23:30:12Z)
- Learning with Capsules: A Survey [73.31150426300198]
Capsule networks were proposed as an alternative approach to Convolutional Neural Networks (CNNs) for learning object-centric representations.
Unlike CNNs, capsule networks are designed to explicitly model part-whole hierarchical relationships.
arXiv Detail & Related papers (2022-06-06T15:05:36Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Learning Connectivity of Neural Networks from a Topological Perspective [80.35103711638548]
We propose a topological perspective that represents a network as a complete graph for analysis.
By assigning learnable parameters to the edges which reflect the magnitude of connections, the learning process can be performed in a differentiable manner.
This learning process is compatible with existing networks and adapts to larger search spaces and different tasks.
arXiv Detail & Related papers (2020-08-19T04:53:31Z)
- Subspace Capsule Network [85.69796543499021]
SubSpace Capsule Network (SCN) exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity.
SCN can be applied to both discriminative and generative models without incurring computational overhead compared to CNNs during test time.
arXiv Detail & Related papers (2020-02-07T17:51:56Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.