BCN: Batch Channel Normalization for Image Classification
- URL: http://arxiv.org/abs/2312.00596v1
- Date: Fri, 1 Dec 2023 14:01:48 GMT
- Title: BCN: Batch Channel Normalization for Image Classification
- Authors: Afifa Khaled, Chao Li, Jia Ning, Kun He
- Abstract summary: This paper presents a novel normalization technique called Batch Channel Normalization (BCN)
As a basic block, BCN can be easily integrated into existing models for various applications in the field of computer vision.
- Score: 13.262032378453073
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalization techniques have been widely used in the field of deep learning
because they enable higher learning rates and make training less sensitive to
initialization. However, the effectiveness of popular normalization
techniques is typically limited to specific settings. Unlike the standard Batch
Normalization (BN) and Layer Normalization (LN), where BN computes the mean and
variance along the (N,H,W) dimensions and LN computes the mean and variance
along the (C,H,W) dimensions (N, C, H and W are the batch, channel, spatial
height and width dimension, respectively), this paper presents a novel
normalization technique called Batch Channel Normalization (BCN). To exploit
both batch and channel dependence, and to adaptively combine the advantages
of BN and LN for a given dataset or task, BCN separately normalizes
inputs along the (N, H, W) and (C, H, W) axes, then combines the normalized
outputs based on adaptive parameters. As a basic block, BCN can be easily
integrated into existing models for various applications in the field of
computer vision. Empirical results show that the proposed technique can be
seamlessly applied to various CNN and Vision Transformer architectures. The
code is publicly available at
https://github.com/AfifaKhaled/BatchChannel-Normalization
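The abstract only sketches how the two normalized outputs are combined; the exact parameterization is in the linked repository. Below is a minimal, hypothetical PyTorch sketch of the idea, assuming a sigmoid-gated per-channel mixing weight: the input is normalized once with BN-style statistics over (N, H, W) and once with LN-style statistics over (C, H, W), and the two results are blended before a shared affine transform. The class name BatchChannelNorm and the `balance` parameter are illustrative, not the authors' implementation.
```python
import torch
import torch.nn as nn

class BatchChannelNorm(nn.Module):
    """Minimal sketch of Batch Channel Normalization (BCN) for 4D inputs (N, C, H, W).

    The input is normalized twice: once with batch-norm statistics over (N, H, W)
    and once with layer-norm statistics over (C, H, W). The two results are then
    blended with a learnable per-channel weight before the usual affine transform.
    """

    def __init__(self, num_channels: int, eps: float = 1e-5, momentum: float = 0.1):
        super().__init__()
        self.eps = eps
        # BN-style branch with running statistics (used at inference time).
        self.bn = nn.BatchNorm2d(num_channels, eps=eps, momentum=momentum, affine=False)
        # Hypothetical adaptive blending parameter: one weight per channel.
        self.balance = nn.Parameter(torch.full((1, num_channels, 1, 1), 0.5))
        # Shared affine parameters applied after blending.
        self.gamma = nn.Parameter(torch.ones(1, num_channels, 1, 1))
        self.beta = nn.Parameter(torch.zeros(1, num_channels, 1, 1))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Batch-norm branch: mean/variance over the (N, H, W) axes.
        x_bn = self.bn(x)
        # Layer-norm branch: mean/variance over the (C, H, W) axes of each sample.
        mean_ln = x.mean(dim=(1, 2, 3), keepdim=True)
        var_ln = x.var(dim=(1, 2, 3), unbiased=False, keepdim=True)
        x_ln = (x - mean_ln) / torch.sqrt(var_ln + self.eps)
        # Adaptive combination of the two normalized outputs.
        w = torch.sigmoid(self.balance)  # keep the mixing weight in (0, 1)
        return self.gamma * (w * x_bn + (1.0 - w) * x_ln) + self.beta
```
Because the module preserves the input shape, a layer like this could in principle be dropped in wherever a BatchNorm2d layer appears in a CNN block.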
Related papers
- Exploring the Efficacy of Group-Normalization in Deep Learning Models for Alzheimer's Disease Classification [2.6447365674762273]
Group Normalization (GN) is a simple alternative to Batch Normalization.
In the reported experiments, GN achieves a low error rate of 10.6%, compared to Batch Normalization.
arXiv Detail & Related papers (2024-04-01T06:10:11Z) - Context Normalization Layer with Applications [0.1499944454332829]
This study proposes a new normalization technique, called context normalization, for image data.
It adjusts the scaling of features based on the characteristics of each sample, which improves the model's convergence speed and performance.
The effectiveness of context normalization is demonstrated on various datasets, and its performance is compared to other standard normalization techniques.
arXiv Detail & Related papers (2023-03-14T06:38:17Z) - Batch Layer Normalization, A new normalization layer for CNNs and RNN [0.0]
This study introduces a new normalization layer termed Batch Layer Normalization (BLN)
As a combination of batch and layer normalization, BLN adaptively weights mini-batch and feature normalization based on the inverse of the mini-batch size.
Test results indicate the practical potential of BLN and its faster convergence than batch normalization and layer normalization in both convolutional and recurrent neural networks.
arXiv Detail & Related papers (2022-09-19T10:12:51Z) - Revisiting Batch Normalization [0.0]
Batch normalization (BN) is essential for training deep neural networks.
We revisit the BN formulation and present a new method and update approach for BN that addresses several of its known issues.
Experimental results using the proposed alterations to BN show statistically significant performance gains in a variety of scenarios.
We also present a new online BN-based input data normalization technique to alleviate the need for other offline or fixed methods.
arXiv Detail & Related papers (2021-10-26T19:48:19Z) - Batch Group Normalization [45.03388237812212]
Batch Normalization (BN) performs well at medium and large batch sizes.
BN saturates at small or extremely large batch sizes because the batch statistics become noisy or unreliable.
Batch Group Normalization (BGN) is proposed to address the noisy or unreliable statistics of BN at small and extremely large batch sizes.
arXiv Detail & Related papers (2020-12-04T18:57:52Z) - Double Forward Propagation for Memorized Batch Normalization [68.34268180871416]
Batch Normalization (BN) has been a standard component in designing deep neural networks (DNNs)
We propose a memorized batch normalization (MBN) which considers multiple recent batches to obtain more accurate and robust statistics.
Compared to related methods, the proposed MBN exhibits consistent behaviors in both training and inference.
arXiv Detail & Related papers (2020-10-10T08:48:41Z) - PowerNorm: Rethinking Batch Normalization in Transformers [96.14956636022957]
The standard normalization method for neural network (NN) models used in Natural Language Processing (NLP) is layer normalization (LN).
LN is preferred due to the empirical observation that a (naive/vanilla) use of BN leads to significant performance degradation for NLP tasks.
We propose Power Normalization (PN), a novel normalization scheme that resolves this issue.
arXiv Detail & Related papers (2020-03-17T17:50:26Z) - Cross-Iteration Batch Normalization [67.83430009388678]
We present Cross-Iteration Batch Normalization (CBN), in which examples from multiple recent iterations are jointly utilized to enhance estimation quality.
CBN is found to outperform the original batch normalization and a direct calculation of statistics over previous iterations without the proposed compensation technique.
arXiv Detail & Related papers (2020-02-13T18:52:57Z) - Towards Stabilizing Batch Statistics in Backward Propagation of Batch
Normalization [126.6252371899064]
Moving Average Batch Normalization (MABN) is a novel normalization method.
We show that MABN can completely restore the performance of vanilla BN in small batch cases.
Our experiments demonstrate the effectiveness of MABN in multiple computer vision tasks including ImageNet and COCO.
arXiv Detail & Related papers (2020-01-19T14:41:22Z) - Region Normalization for Image Inpainting [52.17610250998762]
Feature Normalization (FN), which typically normalizes features across spatial dimensions, is an important technique for training neural networks.
In this work, we show that the mean and variance shifts caused by full-spatial FN limit the image inpainting network training.
We propose a spatial region-wise normalization named Region Normalization (RN) to overcome the limitation.
arXiv Detail & Related papers (2019-11-23T15:16:36Z)