A study on the deviations in performance of FNNs and CNNs in the realm
of grayscale adversarial images
- URL: http://arxiv.org/abs/2209.08262v1
- Date: Sat, 17 Sep 2022 06:25:14 GMT
- Title: A study on the deviations in performance of FNNs and CNNs in the realm
of grayscale adversarial images
- Authors: Durga Shree Nagabushanam, Steve Mathew, Chiranji Lal Chowdhary
- Abstract summary: We show that Neural Networks are prone to lower accuracy when classifying images with noise perturbation.
In our study, we used the hand-written digits dataset MNIST with the following architectures: FNNs with 1 and 2 hidden layers and CNNs with 3, 4, 6 and 8 convolutions, and analyzed their accuracies.
FNNs stand out: irrespective of noise intensity, they maintain a classification accuracy above 85%.
- Score: 0.3437656066916039
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural Networks are prone to lower accuracy when classifying images
with noise perturbation. Convolutional Neural Networks (CNNs) are known for
their unparalleled accuracy in classifying benign images, but our study shows
that they are extremely vulnerable to added noise, while Feed-forward Neural
Networks (FNNs) show little sensitivity to noise perturbation, maintaining
their accuracy almost undisturbed. FNNs are observed to be better at
classifying noise-intensive, single-channeled images that are sheer noise to
human vision. In our study, we used the hand-written digits dataset MNIST with
the following architectures: FNNs with 1 and 2 hidden layers and CNNs with 3,
4, 6 and 8 convolutions, and analyzed their accuracies. FNNs stand out:
irrespective of noise intensity, they maintain a classification accuracy above
85%. In our analysis of CNNs on this data, the decline in classification
accuracy of the CNN with 8 convolutions was half that of the other CNNs.
Correlation analysis and mathematical modelling of the accuracy trends act as
roadmaps to these conclusions.
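The evaluation described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the Gaussian noise model, the noise levels, and the helper names are all assumptions made for the sketch.

```python
import numpy as np

def add_gaussian_noise(images: np.ndarray, sigma: float, seed: int = 0) -> np.ndarray:
    """Perturb images (pixel values in [0, 1]) with zero-mean Gaussian noise
    of standard deviation sigma, then clip back to the valid pixel range."""
    rng = np.random.default_rng(seed)
    return np.clip(images + rng.normal(0.0, sigma, size=images.shape), 0.0, 1.0)

def accuracy(predict, images: np.ndarray, labels: np.ndarray) -> float:
    """Fraction of images a trained classifier labels correctly."""
    return float(np.mean(predict(images) == labels))

def noise_sweep(predict, images, labels, sigmas=(0.0, 0.2, 0.4, 0.6, 0.8)):
    """Evaluate one trained model (an FNN or a CNN) on progressively
    noisier copies of the test set, one accuracy value per noise level."""
    return {s: accuracy(predict, add_gaussian_noise(images, s), labels)
            for s in sigmas}
```

Running `noise_sweep` with each trained FNN and CNN and comparing the resulting accuracy-vs-noise curves yields the kind of trend data on which the study's correlation analysis and mathematical modelling would operate.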
Related papers
- CNN2GNN: How to Bridge CNN with GNN [59.42117676779735]
We propose a novel CNN2GNN framework to unify CNN and GNN together via distillation.
The performance of the distilled "boosted" two-layer GNN on Mini-ImageNet is much higher than that of CNNs containing dozens of layers, such as ResNet152.
arXiv Detail & Related papers (2024-04-23T08:19:08Z)
- OA-CNNs: Omni-Adaptive Sparse CNNs for 3D Semantic Segmentation [70.17681136234202]
We reexamine the design distinctions and test the limits of what a sparse CNN can achieve.
We propose two key components, i.e., adaptive receptive fields (spatially) and adaptive relation, to bridge the gap.
This exploration led to the creation of Omni-Adaptive 3D CNNs (OA-CNNs), a family of networks that integrates a lightweight module.
arXiv Detail & Related papers (2024-03-21T14:06:38Z)
- Decoupled Mixup for Generalized Visual Recognition [71.13734761715472]
We propose a novel "Decoupled-Mixup" method to train CNN models for visual recognition.
Our method decouples each image into discriminative and noise-prone regions, and then heterogeneously combines these regions to train CNN models.
Experiment results show the high generalization performance of our method on testing data that are composed of unseen contexts.
arXiv Detail & Related papers (2022-10-26T15:21:39Z)
- Improving the Accuracy and Robustness of CNNs Using a Deep CCA Neural Data Regularizer [2.026424957803652]
As convolutional neural networks (CNNs) become more accurate at object recognition, their representations become more similar to those of the primate visual system.
Previous attempts to exploit this similarity showed very modest gains in accuracy, owing in part to limitations of the regularization method.
We develop a new neural data regularizer for CNNs that uses Deep Canonical Correlation Analysis (DCCA) to optimize the resemblance of the CNN's image representations to those of the monkey visual cortex.
arXiv Detail & Related papers (2022-09-06T15:40:39Z)
- Neural Architecture Dilation for Adversarial Robustness [56.18555072877193]
A shortcoming of convolutional neural networks is that they are vulnerable to adversarial attacks.
This paper aims to improve the adversarial robustness of the backbone CNNs that have a satisfactory accuracy.
Under a minimal computational overhead, the dilated architecture is expected to preserve the standard performance of the backbone CNN.
arXiv Detail & Related papers (2021-08-16T03:58:00Z)
- Receptive Field Regularization Techniques for Audio Classification and Tagging with Deep Convolutional Neural Networks [7.9495796547433395]
We show that tuning the Receptive Field (RF) of CNNs is crucial to their generalization.
We propose several systematic approaches to control the RF of CNNs and systematically test the resulting architectures.
arXiv Detail & Related papers (2021-05-26T08:36:29Z)
- BreakingBED -- Breaking Binary and Efficient Deep Neural Networks by Adversarial Attacks [65.2021953284622]
We study robustness of CNNs against white-box and black-box adversarial attacks.
Results are shown for distilled CNNs, agent-based state-of-the-art pruned models, and binarized neural networks.
arXiv Detail & Related papers (2021-03-14T20:43:19Z)
- Homography Estimation with Convolutional Neural Networks Under Conditions of Variance [0.0]
We analyze the performance of two recently published methods using Convolutional Neural Networks (CNNs).
CNNs can be trained to be more robust against noise, but at a small cost to accuracy in the noiseless case.
We show that training a CNN to a specific magnitude of noise leads to a "Goldilocks Zone" with regard to the noise levels where that CNN performs best.
arXiv Detail & Related papers (2020-10-02T15:11:25Z)
- Hybrid Tiled Convolutional Neural Networks for Text Sentiment Classification [3.0204693431381515]
We adjust the architecture of the tiled convolutional neural network (tiled CNN) to improve its extraction of salient features for sentiment analysis.
Knowing that the major drawback of the tiled CNN in the NLP field is its inflexible filter structure, we propose a novel architecture called hybrid tiled CNN.
Experiments on the datasets of IMDB movie reviews and SemEval 2017 demonstrate the efficiency of the hybrid tiled CNN.
arXiv Detail & Related papers (2020-01-31T14:08:15Z)
- Approximation and Non-parametric Estimation of ResNet-type Convolutional Neural Networks [52.972605601174955]
We show a ResNet-type CNN can attain the minimax optimal error rates in important function classes.
We derive approximation and estimation error rates of the aforementioned type of CNNs for the Barron and Hölder classes.
arXiv Detail & Related papers (2019-03-24T19:42:39Z)
This list is automatically generated from the titles and abstracts of the papers in this site.