Title
Cyclical Pruning for Sparse Neural Networks
Authors
Abstract
Current methods for pruning neural network weights iteratively apply magnitude-based pruning on the model weights and re-train the resulting model to recover lost accuracy. In this work, we show that such strategies do not allow for the recovery of erroneously pruned weights. To enable weight recovery, we propose a simple strategy called \textit{cyclical pruning} which requires the pruning schedule to be periodic and allows for weights pruned erroneously in one cycle to recover in subsequent ones. Experimental results on both linear models and large-scale deep neural networks show that cyclical pruning outperforms existing pruning algorithms, especially at high sparsity ratios. Our approach is easy to tune and can be readily incorporated into existing pruning pipelines to boost performance.
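The abstract does not include pseudocode, so the following is a minimal NumPy sketch of one plausible way to realize a periodic (cyclical) sparsity schedule for magnitude pruning. The cubic ramp shape, the reset to zero sparsity at each cycle boundary, and all function names are illustrative assumptions, not the paper's exact schedule.

```python
# Minimal sketch (not the authors' reference code) of a cyclical sparsity
# schedule combined with magnitude pruning. Assumptions: cubic ramp within
# each cycle and a reset to zero sparsity at every cycle boundary.
import numpy as np

def cyclical_sparsity(step, steps_per_cycle, target_sparsity):
    """Sparsity ramps from 0 to `target_sparsity` within every cycle.

    Resetting to low sparsity at each cycle boundary lets weights that were
    pruned (possibly erroneously) in the previous cycle receive gradient
    updates again and recover.
    """
    phase = (step % steps_per_cycle) / steps_per_cycle  # in [0, 1)
    # Cubic ramp (as in gradual magnitude pruning); any monotone ramp works.
    return target_sparsity * (1.0 - (1.0 - phase) ** 3)

def magnitude_mask(weights, sparsity):
    """Binary mask keeping the largest-magnitude fraction (1 - sparsity)."""
    k = int(round(sparsity * weights.size))
    if k == 0:
        return np.ones_like(weights)
    threshold = np.partition(np.abs(weights).ravel(), k - 1)[k - 1]
    return (np.abs(weights) > threshold).astype(weights.dtype)

# Example: 3 cycles of 1000 steps each, 90% final sparsity.
w = np.random.randn(64, 64)
for step in range(3000):
    s = cyclical_sparsity(step, steps_per_cycle=1000, target_sparsity=0.9)
    mask = magnitude_mask(w, s)
    # ... apply a gradient update to w here, then re-apply the mask:
    w = w * mask
```

In this sketch, weights zeroed near the end of one cycle become trainable again at the start of the next cycle (where the mask is all ones), which is the recovery mechanism the abstract describes; the final cycle ends at the target sparsity.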