Paper Title

Position-based Scaled Gradient for Model Quantization and Pruning

Paper Authors

Jangho Kim, KiYoon Yoo, Nojun Kwak

Paper Abstract

We propose the position-based scaled gradient (PSG), which scales the gradient depending on the position of a weight vector to make the model more compression-friendly. First, we show theoretically that applying PSG to standard gradient descent (GD), a combination we call PSGD, is equivalent to GD in a warped weight space, i.e., a space made by warping the original weight space via an appropriately designed invertible function. Second, we show empirically that PSG, acting as a regularizer on the weight vectors, is favorable for model compression domains such as quantization and pruning. PSG reduces the gap between the weight distribution of a full-precision model and that of its compressed counterpart. This enables versatile deployment of a single model either in uncompressed mode or in compressed mode, depending on the availability of resources. Experimental results on the CIFAR-10/100 and ImageNet datasets show the effectiveness of the proposed PSG in both pruning and quantization, even at extremely low bit-widths. The code is released on GitHub.
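To make the mechanism described in the abstract concrete, below is a minimal sketch of a position-based scaled gradient step, assuming an elementwise scale proportional to the distance from each weight to its nearest compression-friendly target point (quantization grid points, or zero for pruning). The function name `psg_step`, the toy quadratic loss, and the 2-bit-style grid are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def psg_step(w, grad, targets, lr=0.05, eps=1e-12):
    """One gradient step with a position-based scale (illustrative sketch).

    Each weight's gradient is multiplied by the distance to its nearest
    target point, so weights already sitting near a target (a quantization
    grid point, or 0 for pruning) barely move, while weights far from
    every target are updated more aggressively.
    """
    # Elementwise distance from each weight to its nearest target point.
    dist = np.abs(w[:, None] - targets[None, :]).min(axis=1)
    # Position-based scale; eps keeps the update from vanishing exactly
    # at a target point.
    return w - lr * (dist + eps) * grad

# Toy usage: a quadratic loss pulls the weights toward w_star, while the
# position-based scale makes them stall near 2-bit-style grid points.
rng = np.random.default_rng(0)
w = rng.normal(size=8)
w_star = rng.normal(size=8)
grid = np.array([-1.0, -0.5, 0.0, 0.5, 1.0])
for _ in range(500):
    grad = 2.0 * (w - w_star)  # gradient of the toy loss ||w - w_star||^2
    w = psg_step(w, grad, grid)
print(np.round(w, 3))  # most weights settle at a grid point on the way to w_star
```

In the paper, the scale is derived from an invertible warping function so that this update corresponds to plain GD in the warped space; the distance-proportional scale used here is one simple way to reproduce that regularizing effect, under the stated assumptions.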
