Convolutional Fully-Connected Capsule Network (CFC-CapsNet): A Novel and Fast Capsule Network
- URL: http://arxiv.org/abs/2511.05617v1
- Date: Thu, 06 Nov 2025 19:27:15 GMT
- Title: Convolutional Fully-Connected Capsule Network (CFC-CapsNet): A Novel and Fast Capsule Network
- Authors: Pouya Shiri, Amirali Baniasadi
- Abstract summary: We introduce Convolutional Fully-Connected Capsule Network (CFC-CapsNet) to address the shortcomings of CapsNet. CFC-CapsNet produces fewer, yet more powerful, capsules, resulting in higher network accuracy. Our experiments show that CFC-CapsNet achieves competitive accuracy with faster training and inference.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: A Capsule Network (CapsNet) is a relatively new classifier and one of the possible successors of Convolutional Neural Networks (CNNs). CapsNet maintains the spatial hierarchies between features and outperforms CNNs at classifying images with overlapping categories. Even though CapsNet works well on small-scale datasets such as MNIST, it fails to achieve a similar level of performance on more complicated datasets and real applications. In addition, CapsNet is slower than CNNs on the same task and relies on a larger number of parameters. In this work, we introduce the Convolutional Fully-Connected Capsule Network (CFC-CapsNet) to address the shortcomings of CapsNet by creating capsules with a different method. We introduce a new layer (the CFC layer) as an alternative way of creating capsules. CFC-CapsNet produces fewer, yet more powerful, capsules, resulting in higher network accuracy. Our experiments show that, compared to conventional CapsNet, CFC-CapsNet achieves competitive accuracy with faster training and inference and fewer parameters on the CIFAR-10, SVHN and Fashion-MNIST datasets.
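The abstract does not spell out how the CFC layer builds capsules, so the following is only a loose, hypothetical sketch of the general idea it describes: a fully-connected map from flattened convolutional features to a small set of capsule vectors, passed through the standard CapsNet squash non-linearity. All names and shapes here are illustrative assumptions, not the paper's actual layer.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    """Standard CapsNet squashing non-linearity: scales each capsule
    vector so its length lies in [0, 1) while preserving direction."""
    norm_sq = np.sum(v ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * v / np.sqrt(norm_sq + eps)

def cfc_capsules(feature_maps, num_capsules, capsule_dim, seed=0):
    """Hypothetical CFC-style layer: flatten conv features, apply a single
    fully-connected map, and reshape into a few capsule vectors."""
    rng = np.random.default_rng(seed)
    flat = feature_maps.reshape(feature_maps.shape[0], -1)   # (batch, features)
    w = rng.standard_normal((flat.shape[1], num_capsules * capsule_dim)) * 0.01
    caps = (flat @ w).reshape(-1, num_capsules, capsule_dim)
    return squash(caps)                                      # lengths < 1

# Example: 4 images of 8x8x16 conv features -> 10 capsules of dimension 8
feats = np.random.default_rng(1).standard_normal((4, 8, 8, 16))
caps = cfc_capsules(feats, num_capsules=10, capsule_dim=8)
print(caps.shape)  # (4, 10, 8)
```

The point of the sketch is the shape arithmetic: one dense map yields far fewer primary capsules than the usual convolutional capsule grid, which is consistent with the "fewer, yet more powerful capsules" claim.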
Related papers
- PrunedCaps: A Case For Primary Capsules Discrimination [0.06372261626436675]
We show that a pruned version of CapsNet performs up to 9.90 times faster than the conventional architecture. Our pruned architecture saves more than 95.36 percent of the floating-point operations in the dynamic routing stage of the architecture.
arXiv Detail & Related papers (2025-12-02T04:31:58Z) - DL-CapsNet: A Deep and Light Capsule Network [0.07161783472741746]
We propose a deep variant of CapsNet consisting of several capsule layers. DL-CapsNet, while being highly accurate, employs a small number of parameters and delivers faster training and inference.
arXiv Detail & Related papers (2025-11-23T05:45:11Z) - LE-CapsNet: A Light and Enhanced Capsule Network [0.07161783472741746]
Capsule Network (CapsNet) has several advantages over CNNs, but is slow due to its different structure. We propose LE-CapsNet as a light, enhanced and more accurate variant of CapsNet.
arXiv Detail & Related papers (2025-11-12T15:45:48Z) - Quick-CapsNet (QCN): A fast alternative to Capsule Networks [0.06372261626436675]
We introduce Quick-CapsNet (QCN) as a fast alternative to CapsNet. QCN produces fewer capsules, which results in a faster network. Inference is 5x faster on the MNIST, F-MNIST, SVHN and CIFAR-10 datasets.
arXiv Detail & Related papers (2025-10-08T22:41:28Z) - Return of ChebNet: Understanding and Improving an Overlooked GNN on Long Range Tasks [53.974190296524455]
We revisit ChebNet to shed light on its ability to model distant node interactions. We cast ChebNet as a stable and non-dissipative dynamical system, which we coin Stable-ChebNet.
arXiv Detail & Related papers (2025-06-09T10:41:34Z) - RobCaps: Evaluating the Robustness of Capsule Networks against Affine Transformations and Adversarial Attacks [11.302789770501303]
Capsule Networks (CapsNets) are able to hierarchically preserve the pose relationships between multiple objects for image classification tasks.
In this paper, we evaluate different factors affecting the robustness of CapsNets, compared to traditional Convolutional Neural Networks (CNNs).
arXiv Detail & Related papers (2023-04-08T09:58:35Z) - MogaNet: Multi-order Gated Aggregation Network [61.842116053929736]
We propose a new family of modern ConvNets, dubbed MogaNet, for discriminative visual representation learning. MogaNet encapsulates conceptually simple yet effective convolutions and gated aggregation into a compact module. MogaNet exhibits great scalability, impressive parameter efficiency, and competitive performance compared to state-of-the-art ViTs and ConvNets on ImageNet.
arXiv Detail & Related papers (2022-11-07T04:31:17Z) - Parallel Capsule Networks for Classification of White Blood Cells [1.5749416770494706]
Capsule Networks (CapsNets) are a machine learning architecture proposed to overcome some of the shortcomings of convolutional neural networks (CNNs).
We present a new architecture, parallel CapsNets, which exploits the concept of branching the network to isolate certain capsules.
arXiv Detail & Related papers (2021-08-05T14:30:44Z) - Interpretable Graph Capsule Networks for Object Recognition [17.62514568986647]
We propose interpretable Graph Capsule Networks (GraCapsNets), where we replace the routing part with a multi-head attention-based Graph Pooling approach.
GraCapsNets achieve better classification performance with fewer parameters and better adversarial robustness, when compared to CapsNets.
arXiv Detail & Related papers (2020-12-03T03:18:00Z) - ResNet or DenseNet? Introducing Dense Shortcuts to ResNet [80.35001540483789]
This paper presents a unified perspective of dense summation to analyze ResNet and DenseNet.
We propose dense weighted normalized shortcuts as a solution to the dilemma between ResNet and DenseNet.
Our proposed DSNet achieves significantly better results than ResNet, and achieves performance comparable to DenseNet while requiring fewer resources.
arXiv Detail & Related papers (2020-10-23T16:00:15Z) - iCapsNets: Towards Interpretable Capsule Networks for Text Classification [95.31786902390438]
Traditional machine learning methods are easy to interpret but have low accuracies.
We propose interpretable capsule networks (iCapsNets) to bridge this gap.
iCapsNets can be interpreted both locally and globally.
arXiv Detail & Related papers (2020-05-16T04:11:44Z) - Q-CapsNets: A Specialized Framework for Quantizing Capsule Networks [12.022910298030219]
Capsule Networks (CapsNets) have superior learning capabilities in machine learning tasks, like image classification, compared to traditional CNNs.
CapsNets require extremely intensive computation and are difficult to deploy in their original form on resource-constrained edge devices.
This paper makes the first attempt to quantize CapsNet models, to enable their efficient edge implementations, by developing a specialized quantization framework for CapsNets.
arXiv Detail & Related papers (2020-04-15T14:32:45Z) - Subspace Capsule Network [85.69796543499021]
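Q-CapsNets' actual quantization framework is not described in this summary; as a generic stand-in for the kind of weight quantization such a framework might apply, here is a toy symmetric uniform quantizer. The function names and the 8-bit choice are illustrative assumptions only.

```python
import numpy as np

def quantize_uniform(w, bits=8):
    """Toy symmetric uniform quantization: map float weights onto a
    signed integer grid with a single per-tensor scale factor."""
    qmax = 2 ** (bits - 1) - 1                     # e.g. 127 for int8
    scale = np.max(np.abs(w)) / qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from the integer grid."""
    return q.astype(np.float32) * scale

w = np.linspace(-1.0, 1.0, 9, dtype=np.float32)    # toy weight tensor
q, s = quantize_uniform(w, bits=8)
w_hat = dequantize(q, s)
print(np.max(np.abs(w - w_hat)) <= s / 2)          # True: error within half a step
```

Round-to-nearest guarantees the reconstruction error stays within half a quantization step, which is why low-bit weights can preserve accuracy while shrinking the model for edge deployment.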
SubSpace Capsule Network (SCN) exploits the idea of capsule networks to model possible variations in the appearance or implicitly defined properties of an entity.
SCN can be applied to both discriminative and generative models without incurring computational overhead compared to CNN during test time.
arXiv Detail & Related papers (2020-02-07T17:51:56Z) - Convolutional Networks with Dense Connectivity [59.30634544498946]
We introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion.
For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers.
We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks.
arXiv Detail & Related papers (2020-01-08T06:54:53Z)
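The DenseNet summary above describes its defining mechanism concretely: every layer takes the concatenation of all preceding feature maps as input. A minimal sketch of that connectivity pattern, using plain vectors instead of 2-D feature maps for clarity (the layer count and growth rate are illustrative assumptions):

```python
import numpy as np

def dense_block(x, num_layers, growth, seed=0):
    """Toy dense connectivity: each layer sees the concatenation of the
    input and every earlier layer's output, and adds `growth` channels."""
    rng = np.random.default_rng(seed)
    features = [x]
    for _ in range(num_layers):
        inp = np.concatenate(features, axis=-1)            # all earlier outputs
        w = rng.standard_normal((inp.shape[-1], growth)) * 0.01
        features.append(np.maximum(inp @ w, 0.0))          # linear + ReLU
    return np.concatenate(features, axis=-1)

x = np.ones((2, 24))                                       # batch of 2, 24 channels
y = dense_block(x, num_layers=3, growth=12)
print(y.shape)  # (2, 60): 24 input channels + 3 layers x 12 growth
```

The channel count grows linearly (24 + 3·12 = 60 here), which is the feature-reuse property the abstract's "each layer to every other layer" description refers to.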
This list is automatically generated from the titles and abstracts of the papers on this site.