A Discriminative Channel Diversification Network for Image
Classification
- URL: http://arxiv.org/abs/2112.05861v1
- Date: Fri, 10 Dec 2021 23:00:53 GMT
- Title: A Discriminative Channel Diversification Network for Image
Classification
- Authors: Krushi Patel, Guanghui Wang
- Abstract summary: We propose a light-weight and effective attention module, called channel diversification block, to enhance the global context.
Unlike other channel attention mechanisms, the proposed module focuses on the most discriminative features.
Experiments on CIFAR-10, SVHN, and Tiny-ImageNet datasets demonstrate that the proposed module improves the performance of the baseline networks by a margin of 3% on average.
- Score: 21.049734250642974
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Channel attention mechanisms in convolutional neural networks have been
proven to be effective in various computer vision tasks. However, the
performance improvement comes with additional model complexity and computation
cost. In this paper, we propose a light-weight and effective attention module,
called channel diversification block, to enhance the global context by
establishing the channel relationship at the global level. Unlike other channel
attention mechanisms, the proposed module focuses on the most discriminative
features by giving more attention to the spatially distinguishable channels
while taking the channel activation into account. Different from other attention
models that plug the module in between several intermediate layers, the
proposed module is embedded at the end of the backbone networks, making it easy
to implement. Extensive experiments on CIFAR-10, SVHN, and Tiny-ImageNet
datasets demonstrate that the proposed module improves the performance of the
baseline networks by a margin of 3% on average.
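As a rough illustration of the idea, the following PyTorch sketch shows one way a channel re-weighting block could be appended after the last convolutional stage of a backbone; the class name, the use of per-channel spatial variance as a stand-in for "spatially distinguishable channels", and global average pooling as the channel-activation cue are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


class ChannelDiversificationSketch(nn.Module):
    """Re-weights the final feature map using channel activation and a spatial-variance cue."""

    def __init__(self, channels: int):
        super().__init__()
        # Light-weight: one learnable scale per channel, no extra conv layers.
        self.scale = nn.Parameter(torch.ones(channels))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, height, width) features from the last backbone stage.
        b, c, h, w = x.shape
        flat = x.view(b, c, h * w)
        activation = flat.mean(dim=2)                   # channel activation (global average pooling)
        spatial_cue = flat.var(dim=2, unbiased=False)   # proxy for spatial distinguishability
        # Combine both cues and normalize across channels at the global level.
        attn = torch.softmax(self.scale * (activation + spatial_cue), dim=1)
        return x * attn.view(b, c, 1, 1)


if __name__ == "__main__":
    feats = torch.randn(2, 512, 7, 7)   # e.g. ResNet final-stage features
    block = ChannelDiversificationSketch(512)
    print(block(feats).shape)           # torch.Size([2, 512, 7, 7])
```

Placed between the backbone's final stage and the classifier head, a block of this form re-weights only one feature map and adds a single vector of parameters, which is consistent with the light-weight, easy-to-embed design described in the abstract.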
Related papers
- SOFTS: Efficient Multivariate Time Series Forecasting with Series-Core Fusion [59.96233305733875]
Time series forecasting plays a crucial role in various fields such as finance, traffic management, energy, and healthcare.
Several methods utilize mechanisms like attention or mixer to address this by capturing channel correlations.
This paper presents an efficient MLP-based model, the Series-cOre Fused Time Series forecaster (SOFTS).
arXiv Detail & Related papers (2024-04-22T14:06:35Z)
- MCA: Moment Channel Attention Networks [10.780493635885225]
We investigate the statistical moments of feature maps within a neural network.
Our findings highlight the critical role of high-order moments in enhancing model capacity.
We propose the Moment Channel Attention (MCA) framework, which efficiently incorporates multiple levels of moment-based information.
arXiv Detail & Related papers (2024-03-04T04:02:59Z)
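As a rough reading of the MCA summary above, the sketch below gates channels with their statistical moments; the choice of mean and variance as the moments and the two-layer gating MLP are assumptions, not the actual MCA design.

```python
import torch
import torch.nn as nn


class MomentChannelGateSketch(nn.Module):
    """Gates channels using per-channel statistical moments of the feature map."""

    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        # Two moments per channel -> bottleneck -> one gate value per channel.
        self.gate = nn.Sequential(
            nn.Linear(2 * channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        flat = x.view(b, c, h * w)
        mean = flat.mean(dim=2)                  # first moment
        var = flat.var(dim=2, unbiased=False)    # second central moment
        gate = self.gate(torch.cat([mean, var], dim=1))
        return x * gate.view(b, c, 1, 1)
```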
- Joint Channel Estimation and Feedback with Masked Token Transformers in Massive MIMO Systems [74.52117784544758]
This paper proposes an encoder-decoder based network that unveils the intrinsic frequency-domain correlation within the CSI matrix.
The entire encoder-decoder network is utilized for channel compression.
Our method outperforms state-of-the-art channel estimation and feedback techniques in joint tasks.
arXiv Detail & Related papers (2023-06-08T06:15:17Z)
- Efficient Multi-Scale Attention Module with Cross-Spatial Learning [4.046170185945849]
A novel efficient multi-scale attention (EMA) module is proposed.
We focus on retaining the information of each channel while decreasing the computational overhead.
We conduct extensive ablation studies and experiments on image classification and object detection tasks.
arXiv Detail & Related papers (2023-05-23T00:35:47Z)
- A Generic Shared Attention Mechanism for Various Backbone Neural Networks [53.36677373145012]
Self-attention modules (SAMs) produce strongly correlated attention maps across different layers.
Dense-and-Implicit Attention (DIA) shares SAMs across layers and employs a long short-term memory module.
Our simple yet effective DIA can consistently enhance various network backbones.
arXiv Detail & Related papers (2022-10-27T13:24:08Z)
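The layer-sharing idea summarized in the DIA entry above could look roughly like the sketch below; the SE-style pooling, the sigmoid gate, and the assumption that the stages sharing the module have equal channel width are illustrative choices, not the paper's exact design.

```python
import torch
import torch.nn as nn


class SharedLayerwiseAttentionSketch(nn.Module):
    """One attention gate and one LSTM cell shared by every backbone stage."""

    def __init__(self, channels: int):
        super().__init__()
        # The same LSTM cell is reused at every stage, so attention information
        # flows across layers through its hidden state.
        self.cell = nn.LSTMCell(channels, channels)

    def forward(self, stage_features: list[torch.Tensor]) -> list[torch.Tensor]:
        # stage_features: feature maps from successive stages, all assumed to
        # have the same channel width to keep the sketch short.
        b, c = stage_features[0].shape[:2]
        h = stage_features[0].new_zeros(b, c)
        cell_state = torch.zeros_like(h)
        outputs = []
        for feat in stage_features:
            descriptor = feat.mean(dim=(2, 3))           # global average pooling
            h, cell_state = self.cell(descriptor, (h, cell_state))
            gate = torch.sigmoid(h)                      # channel-wise gate
            outputs.append(feat * gate.view(b, c, 1, 1))
        return outputs
```

A single instance of this module would be called with the feature maps of all stages, so the attention parameters do not grow with network depth.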
- Global Attention Mechanism: Retain Information to Enhance Channel-Spatial Interactions [1.4438155481047366]
We propose a global attention mechanism that boosts the performance of deep neural networks by reducing information reduction and magnifying the global interactive representations.
The evaluation of the proposed mechanism for the image classification task on CIFAR-100 and ImageNet-1K indicates that our method stably outperforms several recent attention mechanisms with both ResNet and lightweight MobileNet.
arXiv Detail & Related papers (2021-12-10T14:12:32Z)
- Convolutional Neural Network optimization via Channel Reassessment Attention module [19.566271646280978]
We propose a novel network optimization module called the Channel Reassessment Attention (CRA) module.
The CRA module uses channel attention with the spatial information of feature maps to enhance the representational power of networks.
Experiments on ImageNet and MS COCO datasets demonstrate that embedding the CRA module in various networks effectively improves the performance under different evaluation standards.
arXiv Detail & Related papers (2020-10-12T11:27:17Z)
- Single Image Super-Resolution via a Holistic Attention Network [87.42409213909269]
We propose a new holistic attention network (HAN) to model the holistic interdependencies among layers, channels, and positions.
The proposed HAN adaptively emphasizes hierarchical features by considering correlations among layers.
Experiments demonstrate that the proposed HAN performs favorably against the state-of-the-art single image super-resolution approaches.
arXiv Detail & Related papers (2020-08-20T04:13:15Z)
- Channel Interaction Networks for Fine-Grained Image Categorization [61.095320862647476]
Fine-grained image categorization is challenging due to the subtle inter-class differences.
We propose a channel interaction network (CIN), which models the channel-wise interplay both within an image and across images.
Our model can be trained efficiently in an end-to-end fashion without the need for multi-stage training and testing.
arXiv Detail & Related papers (2020-03-11T11:51:51Z)
- Global Context-Aware Progressive Aggregation Network for Salient Object Detection [117.943116761278]
We propose a novel network named GCPANet to integrate low-level appearance features, high-level semantic features, and global context features.
We show that the proposed approach outperforms the state-of-the-art methods both quantitatively and qualitatively.
arXiv Detail & Related papers (2020-03-02T04:26:10Z)
- Hybrid Multiple Attention Network for Semantic Segmentation in Aerial Images [24.35779077001839]
We propose a novel attention-based framework named Hybrid Multiple Attention Network (HMANet) to adaptively capture global correlations.
We introduce a simple yet effective region shuffle attention (RSA) module to reduce feature redundancy and improve the efficiency of the self-attention mechanism.
arXiv Detail & Related papers (2020-01-09T07:47:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.