Paper Title

$L_2$BN: Enhancing Batch Normalization by Equalizing the $L_2$ Norms of Features

Paper Authors

Zhennan Wang, Kehan Li, Runyi Yu, Yian Zhao, Pengchong Qiao, Chang Liu, Fan Xu, Xiangyang Ji, Guoli Song, Jie Chen

Paper Abstract

In this paper, we analyze batch normalization from the perspective of discriminability and identify a disadvantage overlooked by previous studies: differences in the $L_2$ norms of sample features can hinder batch normalization from producing more distinguished inter-class features and more compact intra-class features. To address this issue, we propose a simple yet effective method that equalizes the $L_2$ norms of sample features. Concretely, we $L_2$-normalize each sample feature before feeding it into batch normalization, so that all features are of the same magnitude. Since the proposed method combines $L_2$ normalization with batch normalization, we name it $L_2$BN. $L_2$BN strengthens the compactness of intra-class features and enlarges the discrepancy between inter-class features. It is easy to implement and takes effect without any additional parameters or hyper-parameters. We evaluate the effectiveness of $L_2$BN through extensive experiments with various models on image classification and acoustic scene classification tasks. The results demonstrate that $L_2$BN boosts the generalization ability of various neural network models and achieves considerable performance improvements.
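
The description in the abstract suggests a very small implementation: divide each sample's feature tensor by its own $L_2$ norm, then apply an ordinary batch-normalization layer. The PyTorch sketch below illustrates this reading; the class name `L2BN`, the choice of `BatchNorm2d`, the normalization axes (all elements of each sample), and the `eps` guard are assumptions for illustration, not the authors' reference code.

```python
import torch
import torch.nn as nn

class L2BN(nn.Module):
    """Minimal sketch of the L2BN idea described in the abstract:
    l2-normalize each sample's feature so all samples share the same
    magnitude, then apply standard batch normalization.
    Assumed details: 4-D (N, C, H, W) input and per-sample
    normalization over all of a sample's elements."""

    def __init__(self, num_features: int, eps: float = 1e-12):
        super().__init__()
        self.bn = nn.BatchNorm2d(num_features)
        self.eps = eps  # numerical guard only, not a tuned hyper-parameter

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W). Compute each sample's l2 norm over all of its
        # elements and divide, so every sample enters BN with the same
        # (unit) magnitude.
        n = x.size(0)
        norms = x.reshape(n, -1).norm(p=2, dim=1).clamp_min(self.eps)
        return self.bn(x / norms.view(n, 1, 1, 1))
```

Under this reading, using the method amounts to swapping each `nn.BatchNorm2d` layer in a network for `L2BN`, which is consistent with the abstract's claim that the method introduces no additional parameters or hyper-parameters.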
