Paper Title

Symmetry Regularization and Saturating Nonlinearity for Robust Quantization

Authors

Sein Park, Yeongsang Jang, Eunhyeok Park

Abstract

Robust quantization improves the tolerance of networks for various implementations, allowing reliable output in different bit-widths or fragmented low-precision arithmetic. In this work, we perform extensive analyses to identify the sources of quantization error and present three insights to robustify a network against quantization: reduction of error propagation, range clamping for error minimization, and inherited robustness against quantization. Based on these insights, we propose two novel methods called symmetry regularization (SymReg) and saturating nonlinearity (SatNL). Applying the proposed methods during training can enhance the robustness of arbitrary neural networks against quantization on existing post-training quantization (PTQ) and quantization-aware training (QAT) algorithms and enables us to obtain a single weight flexible enough to maintain the output quality under various conditions. We conduct extensive studies on CIFAR and ImageNet datasets and validate the effectiveness of the proposed methods.
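The abstract only names the two methods; it does not define them. As a rough illustration of the stated ideas (a regularizer encouraging weight symmetry, and a bounded activation that limits range-clamping error), here is a minimal PyTorch sketch. The penalty form, the clipping threshold, and the names `symmetry_penalty` and `SaturatingNonlinearity` are assumptions for illustration, not the paper's actual formulation.

```python
# Hypothetical sketch of SymReg- and SatNL-style components.
# The exact regularizer and activation in the paper may differ.
import torch
import torch.nn as nn

def symmetry_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Assumed SymReg-style term: penalize asymmetry of the weight
    distribution around zero, which symmetric uniform quantizers favor."""
    mean_term = weight.mean().pow(2)                   # distribution centered at 0
    tail_term = (weight.max() + weight.min()).pow(2)   # 0 when the range is symmetric
    return mean_term + tail_term

class SaturatingNonlinearity(nn.Module):
    """Assumed SatNL-style activation: a saturating (bounded) function,
    so the activation range, and hence range-clamping error, stays bounded."""
    def __init__(self, threshold: float = 3.0):
        super().__init__()
        self.threshold = threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # ReLU-like but saturating: clip activations into [0, threshold].
        return torch.clamp(x, min=0.0, max=self.threshold)

# Usage: add the symmetry penalty to the task loss during training.
model = nn.Sequential(nn.Linear(16, 32), SaturatingNonlinearity(), nn.Linear(32, 10))
x, y = torch.randn(8, 16), torch.randint(0, 10, (8,))
loss = nn.functional.cross_entropy(model(x), y)
loss = loss + 1e-4 * sum(symmetry_penalty(p)
                         for n, p in model.named_parameters() if "weight" in n)
loss.backward()
```

Both components act only at training time, which matches the abstract's claim that the resulting single set of weights remains robust under existing PTQ and QAT pipelines afterward.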
