Attention Mechanism Meets with Hybrid Dense Network for Hyperspectral
Image Classification
- URL: http://arxiv.org/abs/2201.01001v1
- Date: Tue, 4 Jan 2022 06:30:24 GMT
- Title: Attention Mechanism Meets with Hybrid Dense Network for Hyperspectral
Image Classification
- Authors: Muhammad Ahmad, Adil Mehmood Khan, Manuel Mazzara, Salvatore
Distefano, Swalpa Kumar Roy and Xin Wu
- Abstract summary: Convolutional Neural Networks (CNNs) are well suited to Hyperspectral Image Classification (HSIC).
However, fixed kernel sizes make traditional CNNs too specific, neither flexible nor conducive to feature learning, which hurts classification accuracy.
The proposed solution combines the core idea of 3D and 2D Inception nets with an attention mechanism to boost CNN performance on HSIC in a hybrid scenario.
The resulting attention-fused hybrid network (AfNet) is based on three attention-fused parallel hybrid sub-nets with different kernels in each block, repeatedly using high-level features to enhance the final ground-truth maps.
- Score: 6.946336514955953
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Convolutional Neural Networks (CNNs) are well suited to Hyperspectral Image
Classification (HSIC). However, fixed kernel sizes make traditional CNNs too specific:
neither flexible nor conducive to feature learning, and thus detrimental to
classification accuracy. Convolving with kernels of different sizes may overcome this
problem by capturing more discriminative and relevant information. In light of this,
the proposed solution combines the core idea of 3D and 2D Inception nets with an
attention mechanism to boost CNN performance on HSIC in a hybrid scenario. The
resulting attention-fused hybrid network (AfNet) is based on three attention-fused
parallel hybrid sub-nets with different kernels in each block, repeatedly using
high-level features to enhance the final ground-truth maps. In short, AfNet is able to
selectively filter out the discriminative features critical for classification.
Several tests on HSI datasets showed competitive results for AfNet compared to
state-of-the-art models. The proposed pipeline achieved an overall accuracy of 97% on
Indian Pines, 100% on Botswana, and 99% on the Pavia University, Pavia Center, and
Salinas datasets.
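As a rough illustration of the architecture described in the abstract, the following is a
minimal PyTorch sketch of one attention-fused hybrid block: parallel 3D convolutions with
different kernel sizes (Inception-style) are fused through a simple channel-attention
gate, and the spectral dimension is then folded into channels for a 2D convolution. The
kernel sizes, channel widths, and the squeeze-and-excite-style attention are illustrative
assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn


class AttentionFusedHybridBlock(nn.Module):
    """Sketch of a hybrid 3D/2D block with attention-weighted fusion of parallel branches."""

    def __init__(self, in_ch=1, branch_ch=8, bands=30):
        super().__init__()
        # Three parallel 3D branches with different spectral kernel depths.
        self.branches = nn.ModuleList([
            nn.Conv3d(in_ch, branch_ch, kernel_size=(k, 3, 3), padding=(k // 2, 1, 1))
            for k in (3, 5, 7)
        ])
        fused_ch = 3 * branch_ch
        # Channel attention over the concatenated branch outputs (assumed form).
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool3d(1), nn.Flatten(),
            nn.Linear(fused_ch, fused_ch // 2), nn.ReLU(),
            nn.Linear(fused_ch // 2, fused_ch), nn.Sigmoid(),
        )
        # 2D convolution after folding the spectral bands into the channel axis.
        self.conv2d = nn.Conv2d(fused_ch * bands, 64, kernel_size=3, padding=1)

    def forward(self, x):                               # x: (B, 1, bands, H, W)
        feats = torch.cat([branch(x) for branch in self.branches], dim=1)
        gate = self.attn(feats).view(feats.size(0), -1, 1, 1, 1)
        feats = feats * gate                            # attention-fused features
        b, c, d, h, w = feats.shape
        return torch.relu(self.conv2d(feats.reshape(b, c * d, h, w)))


# Example: a batch of 30-band, 11x11 HSI patches.
out = AttentionFusedHybridBlock()(torch.randn(2, 1, 30, 11, 11))
print(out.shape)  # torch.Size([2, 64, 11, 11])
```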
Related papers
- Learning CNN on ViT: A Hybrid Model to Explicitly Class-specific Boundaries for Domain Adaptation [13.753795233064695]
Most domain adaptation (DA) methods are based on either convolutional neural networks (CNNs) or vision transformers (ViTs).
We design a hybrid method, Explicitly Class-specific Boundaries (ECB), to take full advantage of both ViTs and CNNs.
ECB learns CNN on ViT to combine their distinct strengths.
arXiv Detail & Related papers (2024-03-27T08:52:44Z)
- Hybrid CNN Bi-LSTM neural network for Hyperspectral image classification [1.2691047660244332]
This paper proposes a neural network combining 3-D CNN, 2-D CNN and Bi-LSTM.
It achieves 99.83, 99.98, and 100 percent accuracy on the IP, PU, and SA datasets respectively, using only 30 percent of the trainable parameters of the state-of-the-art model.
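The following is a minimal PyTorch sketch of such a hybrid 3D-CNN / 2D-CNN / Bi-LSTM
pipeline for HSI patches; the channel counts, kernel sizes, and the way the feature map
is unrolled into a sequence for the Bi-LSTM are illustrative assumptions rather than the
paper's exact design.

```python
import torch
import torch.nn as nn


class HybridCNNBiLSTM(nn.Module):
    """Sketch: 3D conv -> 2D conv -> Bi-LSTM over spatial positions -> classifier."""

    def __init__(self, bands=30, n_classes=16):
        super().__init__()
        self.conv3d = nn.Sequential(
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)), nn.ReLU())
        self.conv2d = nn.Sequential(
            nn.Conv2d(8 * bands, 64, kernel_size=3, padding=1), nn.ReLU())
        self.bilstm = nn.LSTM(64, 32, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * 32, n_classes)

    def forward(self, x):                               # x: (B, 1, bands, H, W)
        f = self.conv3d(x)                              # (B, 8, bands, H, W)
        b, c, d, h, w = f.shape
        f = self.conv2d(f.reshape(b, c * d, h, w))      # fold bands into channels
        seq = f.flatten(2).transpose(1, 2)              # (B, H*W, 64) spatial sequence
        out, _ = self.bilstm(seq)
        return self.head(out.mean(dim=1))               # average over the sequence


logits = HybridCNNBiLSTM()(torch.randn(2, 1, 30, 11, 11))
print(logits.shape)  # torch.Size([2, 16])
```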
arXiv Detail & Related papers (2024-02-15T15:46:13Z)
- Fine-grained Recognition with Learnable Semantic Data Augmentation [68.48892326854494]
Fine-grained image recognition is a longstanding computer vision challenge.
We propose diversifying the training data at the feature-level to alleviate the discriminative region loss problem.
Our method significantly improves the generalization performance on several popular classification networks.
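As a toy illustration of feature-level augmentation, the sketch below perturbs each deep
feature vector with Gaussian noise scaled by its class's (diagonal) feature spread; the
paper learns the augmentation distribution, so this fixed-strength, diagonal-covariance
version is only an assumed simplification.

```python
import torch


def augment_features(features, labels, n_classes, strength=0.5):
    """Perturb features with noise scaled by the class-conditional feature spread."""
    augmented = features.clone()
    for c in range(n_classes):
        mask = labels == c
        if mask.sum() > 1:
            spread = features[mask].std(dim=0)  # per-dimension spread for class c
            augmented[mask] = features[mask] + strength * spread * torch.randn_like(features[mask])
    return augmented


feats = torch.randn(8, 64)                      # e.g. penultimate-layer features
labels = torch.randint(0, 4, (8,))
print(augment_features(feats, labels, n_classes=4).shape)  # torch.Size([8, 64])
```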
arXiv Detail & Related papers (2023-09-01T11:15:50Z)
- Sharpend Cosine Similarity based Neural Network for Hyperspectral Image Classification [0.456877715768796]
Hyperspectral Image Classification (HSIC) is a difficult task due to high inter- and intra-class similarity and variability, nested regions, and overlapping.
2D Convolutional Neural Networks (CNNs) emerged as a viable option, whereas 3D CNNs are a better alternative due to their more accurate classification.
This paper introduces the Sharpened Cosine Similarity (SCS) concept as an alternative to convolutions in a neural network for HSIC.
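The sketch below shows one way an SCS layer can replace a 2D convolution: each patch is
compared with each kernel via cosine similarity, then sharpened with a learnable
exponent. The exact normalisation offsets and exponent parameterisation here are
assumptions for illustration, not the paper's definition.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SharpenedCosineSimilarity2d(nn.Module):
    """Drop-in alternative to Conv2d using sharpened cosine similarity (sketch)."""

    def __init__(self, in_ch, out_ch, kernel_size=3, eps=1e-6):
        super().__init__()
        self.weight = nn.Parameter(0.1 * torch.randn(out_ch, in_ch, kernel_size, kernel_size))
        self.p = nn.Parameter(2.0 * torch.ones(out_ch))     # learnable sharpening exponent
        self.q = nn.Parameter(torch.zeros(1))                # learnable norm offset
        self.k, self.eps = kernel_size, eps

    def forward(self, x):                                    # x: (B, C, H, W)
        patches = F.unfold(x, self.k, padding=self.k // 2)   # (B, C*k*k, H*W)
        w = self.weight.flatten(1)                           # (out_ch, C*k*k)
        dot = w @ patches                                    # (B, out_ch, H*W)
        x_norm = patches.norm(dim=1, keepdim=True) + self.q.abs() + self.eps
        w_norm = w.norm(dim=1).view(1, -1, 1) + self.eps
        cos = dot / (x_norm * w_norm)
        scs = cos.sign() * cos.abs().pow(self.p.view(1, -1, 1))
        return scs.view(x.size(0), -1, x.size(2), x.size(3))


out = SharpenedCosineSimilarity2d(in_ch=30, out_ch=16)(torch.randn(2, 30, 11, 11))
print(out.shape)  # torch.Size([2, 16, 11, 11])
```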
arXiv Detail & Related papers (2023-05-26T07:04:00Z)
- HAC-Net: A Hybrid Attention-Based Convolutional Neural Network for Highly Accurate Protein-Ligand Binding Affinity Prediction [0.0]
We present a novel deep learning architecture consisting of a 3-dimensional convolutional neural network and two graph convolutional networks.
HAC-Net obtains state-of-the-art results on the PDBbind v.2016 core set.
We envision that this model can be extended to a broad range of supervised learning problems related to structure-based biomolecular property prediction.
arXiv Detail & Related papers (2022-12-23T16:14:53Z)
- SVNet: Where SO(3) Equivariance Meets Binarization on Point Cloud Representation [65.4396959244269]
The paper tackles the challenge by designing a general framework to construct 3D learning architectures.
The proposed approach can be applied to general backbones like PointNet and DGCNN.
Experiments on ModelNet40, ShapeNet, and the real-world dataset ScanObjectNN demonstrate that the method achieves a good trade-off between efficiency, rotation robustness, and accuracy.
arXiv Detail & Related papers (2022-09-13T12:12:19Z)
- BreakingBED -- Breaking Binary and Efficient Deep Neural Networks by Adversarial Attacks [65.2021953284622]
We study the robustness of CNNs against white-box and black-box adversarial attacks.
Results are shown for distilled CNNs, agent-based state-of-the-art pruned models, and binarized neural networks.
arXiv Detail & Related papers (2021-03-14T20:43:19Z)
- Fusion of CNNs and statistical indicators to improve image classification [65.51757376525798]
Convolutional Networks have dominated the field of computer vision for the last ten years.
The main strategy to prolong this trend relies on further upscaling networks in size.
We hypothesise that adding heterogeneous sources of information to a CNN may be more cost-effective than building a bigger network.
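A minimal sketch of this idea: per-image statistical indicators (here simply channel-wise
mean and standard deviation, as illustrative stand-ins for the paper's indicators) are
concatenated with pooled CNN features before the classifier, instead of enlarging the CNN
itself.

```python
import torch
import torch.nn as nn


class CNNWithStatisticalIndicators(nn.Module):
    """Sketch: fuse a small CNN's features with hand-crafted statistics of the input."""

    def __init__(self, n_classes=10):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),      # (B, 32) pooled CNN features
        )
        self.head = nn.Linear(32 + 6, n_classes)        # 6 = mean + std per RGB channel

    def forward(self, x):                                # x: (B, 3, H, W)
        stats = torch.cat([x.mean(dim=(2, 3)), x.std(dim=(2, 3))], dim=1)  # (B, 6)
        return self.head(torch.cat([self.backbone(x), stats], dim=1))


print(CNNWithStatisticalIndicators()(torch.randn(4, 3, 32, 32)).shape)  # torch.Size([4, 10])
```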
arXiv Detail & Related papers (2020-12-20T23:24:31Z)
- Finite Versus Infinite Neural Networks: an Empirical Study [69.07049353209463]
Kernel methods outperform fully-connected finite-width networks.
Centered and ensembled finite networks have reduced posterior variance.
Weight decay and the use of a large learning rate break the correspondence between finite and infinite networks.
arXiv Detail & Related papers (2020-07-31T01:57:47Z)
- Hyperspectral Classification Based on 3D Asymmetric Inception Network with Data Fusion Transfer Learning [36.05574127972413]
We first deliver a 3D asymmetric inception network, AINet, to overcome the overfitting problem.
With the emphasis on spectral signatures over spatial contexts of HSI data, AINet can convey and classify the features effectively.
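The sketch below illustrates the asymmetric-convolution idea behind such a network: a
full k x k x k 3D kernel is factorised into a spectral (k x 1 x 1) convolution followed
by a spatial (1 x k x k) convolution, putting the emphasis on spectral signatures. The
block layout and widths are assumptions for illustration, not AINet's exact design.

```python
import torch
import torch.nn as nn


class AsymmetricSpectralSpatialUnit(nn.Module):
    """Sketch of an asymmetric 3D unit: spectral conv (k,1,1) then spatial conv (1,k,k)."""

    def __init__(self, in_ch=1, out_ch=16, k=3):
        super().__init__()
        self.spectral = nn.Conv3d(in_ch, out_ch, kernel_size=(k, 1, 1), padding=(k // 2, 0, 0))
        self.spatial = nn.Conv3d(out_ch, out_ch, kernel_size=(1, k, k), padding=(0, k // 2, k // 2))
        self.act = nn.ReLU()

    def forward(self, x):                                # x: (B, C, bands, H, W)
        return self.act(self.spatial(self.act(self.spectral(x))))


out = AsymmetricSpectralSpatialUnit()(torch.randn(2, 1, 30, 11, 11))
print(out.shape)  # torch.Size([2, 16, 30, 11, 11])
```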
arXiv Detail & Related papers (2020-02-11T06:37:34Z)
- Approximation and Non-parametric Estimation of ResNet-type Convolutional Neural Networks [52.972605601174955]
We show a ResNet-type CNN can attain the minimax optimal error rates in important function classes.
We derive approximation and estimation error rates of the aforementioned type of CNNs for the Barron and Hölder classes.
arXiv Detail & Related papers (2019-03-24T19:42:39Z)