An Adaptive Batch Normalization in Deep Learning
- URL: http://arxiv.org/abs/2211.02050v1
- Date: Thu, 3 Nov 2022 12:12:56 GMT
- Title: An Adaptive Batch Normalization in Deep Learning
- Authors: Wael Alsobhi, Tarik Alafif, Alaa Abdel-Hakim, Weiwei Zong
- Abstract summary: Batch Normalization (BN) is a technique for accelerating and stabilizing the training of deep convolutional neural networks.
We propose a threshold-based adaptive BN approach that separates data that requires BN from data that does not.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Batch Normalization (BN) is a technique for accelerating and stabilizing
training in deep convolutional neural networks. However, BN is applied
unconditionally throughout the network, even though some training data may not
require it. In this work, we propose a threshold-based adaptive BN approach that
separates data that requires BN from data that does not. The experimental
evaluation demonstrates that the proposed approach outperforms traditional BN,
mostly at small batch sizes, on MNIST, Fashion-MNIST, CIFAR-10, and CIFAR-100.
It also reduces the occurrence of internal variable transformation, increasing
network stability.
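The abstract does not specify the exact thresholding rule, but the core idea of a threshold-based adaptive BN can be sketched as follows. This is a minimal NumPy illustration under the assumption that the decision criterion is how far each feature's batch statistics deviate from the normalized target (zero mean, unit variance); the function name, criterion, and threshold value are illustrative, not the authors' implementation.

```python
import numpy as np

def adaptive_batch_norm(x, threshold=0.1, eps=1e-5):
    """Illustrative sketch of a threshold-based adaptive BN.

    Normalizes only the features whose batch statistics deviate from
    zero mean / unit variance by more than `threshold`; other features
    pass through unchanged. The deviation criterion is an assumption
    for illustration, not the paper's exact rule.

    x: array of shape (batch_size, num_features)
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # Decide per feature whether normalization is needed.
    needs_bn = (np.abs(mean) > threshold) | (np.abs(var - 1.0) > threshold)
    out = x.copy()
    # Standard BN transform, applied only where the threshold is exceeded.
    out[:, needs_bn] = (x[:, needs_bn] - mean[needs_bn]) / np.sqrt(var[needs_bn] + eps)
    return out
```

Features that already sit near the normalized regime skip the transform, which is one plausible reading of how such an approach could reduce unnecessary normalization work at small batch sizes.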