
BNET: Batch Normalization With Enhanced Linear Transformation



Abstract:

Batch normalization (BN) is a fundamental unit in modern deep neural networks. However, BN and its variants focus on the normalization statistics but neglect the recovery step, which uses a linear transformation to improve the capacity to fit complex data distributions. In this paper, we demonstrate that the recovery step can be improved by aggregating the neighborhood of each neuron rather than considering a single neuron alone. Specifically, we propose a simple yet effective method named batch normalization with enhanced linear transformation (BNET) to embed spatial contextual information and improve representation ability. BNET can be easily implemented with depth-wise convolution and seamlessly transplanted into existing architectures that use BN. To the best of our knowledge, BNET is the first attempt to enhance the recovery step of BN. Furthermore, BN can be interpreted as a special case of BNET from both the spatial and the spectral view. Experimental results demonstrate that BNET achieves consistent performance gains with various backbones on a wide range of visual tasks. Moreover, BNET accelerates the convergence of network training and enhances spatial information by assigning large weights to important neurons.
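The mechanism described in the abstract, replacing BN's per-channel affine recovery with a depth-wise convolution over each neuron's spatial neighborhood, can be illustrated with a short PyTorch sketch. This is an assumption-laden reading of the abstract, not the authors' released implementation: the module name BNET2d, the kernel size k, and the combination of nn.BatchNorm2d with affine=False followed by a depth-wise nn.Conv2d are all illustrative choices.

```python
import torch
import torch.nn as nn


class BNET2d(nn.Module):
    """Illustrative sketch of BN with an enhanced linear transformation.

    Standard BN normalizes each channel and then applies a per-channel
    affine recovery (gamma * x_hat + beta). Here, as the abstract
    describes, the recovery step instead aggregates each neuron's k x k
    spatial neighborhood via a depth-wise convolution.
    """

    def __init__(self, num_channels: int, k: int = 3,
                 eps: float = 1e-5, momentum: float = 0.1):
        super().__init__()
        # Normalization step only; the built-in affine recovery is disabled.
        self.bn = nn.BatchNorm2d(num_channels, eps=eps,
                                 momentum=momentum, affine=False)
        # Depth-wise convolution replaces the per-channel linear
        # transformation; the bias term plays the role of beta.
        self.recover = nn.Conv2d(num_channels, num_channels,
                                 kernel_size=k, padding=k // 2,
                                 groups=num_channels, bias=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.recover(self.bn(x))


if __name__ == "__main__":
    # Hypothetical drop-in usage: swap nn.BatchNorm2d(64) for BNET2d(64).
    layer = BNET2d(64)
    y = layer(torch.randn(8, 64, 32, 32))
    print(y.shape)  # torch.Size([8, 64, 32, 32])
```

Note that with k = 1 the depth-wise convolution degenerates to a per-channel scale and bias, recovering standard BN; this is consistent with the abstract's statement that BN is a special case of BNET.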
Page(s): 9225 - 9232
Date of Publication: 09 January 2023

PubMed ID: 37018583
