Normalization Techniques in Training DNNs: Methodology, Analysis and
Application
- URL: http://arxiv.org/abs/2009.12836v1
- Date: Sun, 27 Sep 2020 13:06:52 GMT
- Title: Normalization Techniques in Training DNNs: Methodology, Analysis and
Application
- Authors: Lei Huang, Jie Qin, Yi Zhou, Fan Zhu, Li Liu, Ling Shao
- Abstract summary: Normalization techniques are essential for accelerating the training and improving the generalization of deep neural networks (DNNs).
This paper reviews and comments on the past, present and future of normalization methods in the context of DNN training.
- Score: 111.82265258916397
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalization techniques are essential for accelerating the training and
improving the generalization of deep neural networks (DNNs), and have
successfully been used in various applications. This paper reviews and comments
on the past, present and future of normalization methods in the context of DNN
training. We provide a unified picture of the main motivation behind different
approaches from the perspective of optimization, and present a taxonomy for
understanding the similarities and differences between them. Specifically, we
decompose the pipeline of the most representative normalizing activation
methods into three components: the normalization area partitioning,
normalization operation and normalization representation recovery. In doing so,
we provide insights for designing new normalization techniques. Finally, we
discuss the current progress in understanding normalization methods, and
provide a comprehensive review of the applications of normalization to
particular tasks, where it can effectively address key issues.
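To make the three-component decomposition above concrete, here is a minimal NumPy sketch that casts batch normalization in these terms; the function name, shapes, and comments are illustrative assumptions, not code from the paper.

```python
import numpy as np

def batch_norm_decomposed(x, gamma, beta, eps=1e-5):
    """Batch normalization written as the survey's three components.
    x: activations of shape (batch, channels); gamma, beta: per-channel."""
    # 1. Normalization area partitioning (NAP): choose which axes share
    #    statistics. Batch norm pools over the batch axis, per channel;
    #    layer norm would instead pool over the channel axis.
    pooled_axes = 0
    # 2. Normalization operation (NOP): standardize within each partition.
    mean = x.mean(axis=pooled_axes, keepdims=True)
    var = x.var(axis=pooled_axes, keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)
    # 3. Normalization representation recovery (NRR): restore expressive
    #    capacity with a learnable per-channel affine transform.
    return gamma * x_hat + beta

x = np.random.randn(8, 4)
y = batch_norm_decomposed(x, gamma=np.ones(4), beta=np.zeros(4))
```

Under this view, many normalizing-activation methods differ mainly in the NAP step, while the NOP and NRR steps stay largely the same.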
Related papers
- Enhancing Neural Network Representations with Prior Knowledge-Based Normalization [0.07499722271664146]
We introduce a new approach to multi-mode normalization that leverages prior knowledge to improve neural network representations.
Our methods demonstrate superior convergence and performance across tasks in image classification, domain adaptation, and image generation.
arXiv Detail & Related papers (2024-03-25T14:17:38Z)
- AFN: Adaptive Fusion Normalization via an Encoder-Decoder Framework [6.293148047652131]
We propose a new normalization function called Adaptive Fusion Normalization.
Through experiments, we demonstrate that AFN outperforms previous normalization techniques in domain generalization and image classification tasks.
arXiv Detail & Related papers (2023-08-07T06:08:51Z)
- NormAUG: Normalization-guided Augmentation for Domain Generalization [60.159546669021346]
We propose a simple yet effective method called NormAUG (Normalization-guided Augmentation) for deep learning.
Our method introduces diverse information at the feature level and improves the generalization of the main path.
In the test stage, we leverage an ensemble strategy to combine the predictions from the auxiliary path of our model, further boosting performance.
arXiv Detail & Related papers (2023-07-25T13:35:45Z)
- TANGOS: Regularizing Tabular Neural Networks through Gradient Orthogonalization and Specialization [69.80141512683254]
We introduce Tabular Neural Gradient Orthogonalization and Specialization (TANGOS), a novel framework for regularization in the tabular setting built on latent unit attributions.
We demonstrate that our approach can lead to improved out-of-sample generalization performance, outperforming other popular regularization methods (a sketch of the core penalty follows this entry).
arXiv Detail & Related papers (2023-03-09T18:57:13Z)
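The TANGOS summary above centers on latent unit attributions. Below is a hedged PyTorch sketch of that idea, not the authors' implementation: per-unit input gradients serve as attributions, an L1 term encourages each unit to specialize on few input features, and a pairwise cosine term pushes attributions toward orthogonality. The function name, the `encoder` argument, and the default weights are assumptions.

```python
import torch

def tangos_style_penalty(encoder, x, lam_orth=1.0, lam_spec=1.0):
    # Hypothetical regularizer in the spirit of TANGOS; not the paper's code.
    x = x.clone().requires_grad_(True)
    h = encoder(x)  # latent activations, shape (batch, n_units)
    grads = []
    for j in range(h.shape[1]):
        # Attribution of unit j: gradient of its activations w.r.t. the input.
        g, = torch.autograd.grad(h[:, j].sum(), x, create_graph=True)
        grads.append(g.flatten(1))
    A = torch.stack(grads, dim=1)  # (batch, n_units, d_in)
    # Specialization: L1 penalty makes each unit's attribution sparse.
    spec = A.abs().mean()
    # Orthogonalization: penalize cosine similarity between unit pairs.
    A_hat = torch.nn.functional.normalize(A, dim=-1)
    sim = A_hat @ A_hat.transpose(1, 2)  # (batch, n_units, n_units)
    eye = torch.eye(sim.shape[1], device=sim.device)
    orth = (sim * (1 - eye)).abs().mean()
    return lam_spec * spec + lam_orth * orth

# Usage sketch: total = task_loss + tangos_style_penalty(model.encoder, x)
```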
- Hierarchical Normalization for Robust Monocular Depth Estimation [85.2304122536962]
We propose a novel multi-scale depth normalization method that hierarchically normalizes the depth representations based on spatial information and depth.
Our experiments show that the proposed normalization strategy remarkably outperforms previous normalization methods.
arXiv Detail & Related papers (2022-10-18T08:18:29Z)
- A Systematic Survey of Regularization and Normalization in GANs [25.188671290175208]
Generative Adversarial Networks (GANs) have been widely applied in different scenarios thanks to the development of deep neural networks.
However, it is still unknown whether GANs can fit the target distribution without any prior information.
Regularization and normalization are common methods of introducing prior information to stabilize training and improve discrimination.
arXiv Detail & Related papers (2020-08-19T12:52:10Z)
- On Connections between Regularizations for Improving DNN Robustness [67.28077776415724]
This paper analyzes regularization terms proposed recently for improving the adversarial robustness of deep neural networks (DNNs).
We study possible connections between several effective methods, including input-gradient regularization, Jacobian regularization, curvature regularization, and a cross-Lipschitz functional; the first of these is sketched after this entry.
arXiv Detail & Related papers (2020-07-04T23:43:32Z)
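Of the terms compared in the entry above, input-gradient regularization is the simplest to write down: the task loss is augmented with the squared norm of its gradient with respect to the input, flattening the local loss surface. A minimal PyTorch sketch, assuming a standard classifier; the function name and default `lam` are illustrative.

```python
import torch
import torch.nn.functional as F

def input_gradient_regularized_loss(model, x, y, lam=0.1):
    # Task loss plus the squared norm of its input gradient.
    x = x.clone().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    # create_graph=True so the penalty itself can be backpropagated.
    grad, = torch.autograd.grad(loss, x, create_graph=True)
    penalty = grad.flatten(1).pow(2).sum(dim=1).mean()
    return loss + lam * penalty
```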
- New Interpretations of Normalization Methods in Deep Learning [41.29746794151102]
We use these tools to conduct a deep analysis of popular normalization methods.
Most of the normalization methods can be interpreted in a unified framework.
We prove that training with these normalization methods makes the norm of the weights increase, which can cause adversarial vulnerability by amplifying attacks; the scale invariance behind this effect is sketched after this entry.
arXiv Detail & Related papers (2020-06-16T12:26:13Z)
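The norm-growth result above rests on the scale invariance of normalized layers: scaling the weights feeding a batch-norm layer leaves the output unchanged, so gradient updates can inflate weight norms without changing the function. A small PyTorch check of that invariance (layer sizes and the scaling factor are arbitrary choices):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
lin, bn = nn.Linear(4, 4, bias=False), nn.BatchNorm1d(4)
bn.train()  # use batch statistics
x = torch.randn(32, 4)
y1 = bn(lin(x))
with torch.no_grad():
    lin.weight.mul_(10.0)  # 10x larger weight norm ...
y2 = bn(lin(x))
print(torch.allclose(y1, y2, atol=1e-4))  # ... same normalized output: True
```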
- Optimization Theory for ReLU Neural Networks Trained with Normalization Layers [82.61117235807606]
The success of deep neural networks is in part due to the use of normalization layers.
Our analysis shows how the introduction of normalization changes the loss landscape and can enable faster convergence.
arXiv Detail & Related papers (2020-06-11T23:55:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.