Paper Title
Neural Network Pruning by Cooperative Coevolution
Paper Authors
Paper Abstract
Neural network pruning is a popular model compression method which can significantly reduce the computing cost with negligible loss of accuracy. Recently, filters have often been pruned directly by designing proper criteria or using auxiliary modules to measure their importance, which, however, requires expertise and trial-and-error. Due to the advantage of automation, pruning by evolutionary algorithms (EAs) has attracted much attention, but the performance is limited for deep neural networks, as the search space can be quite large. In this paper, we propose CCEP, a new filter pruning algorithm based on cooperative coevolution, which prunes the filters in each layer by EAs separately. That is, CCEP reduces the pruning space by a divide-and-conquer strategy. The experiments show that CCEP achieves competitive performance with state-of-the-art pruning methods, e.g., pruning ResNet56 by $63.42\%$ FLOPs on CIFAR10 with a $-0.24\%$ accuracy drop, and ResNet50 by $44.56\%$ FLOPs on ImageNet with a $0.07\%$ accuracy drop.
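The layer-wise divide-and-conquer idea described above can be sketched as follows. This is a minimal, hypothetical illustration (not the paper's implementation): each layer's filters get a binary keep/prune mask, a simple elitist EA with bit-flip mutation evolves each layer's mask independently, and `fitness` stands in for the paper's evaluation criterion (e.g., a proxy combining accuracy and FLOPs reduction).

```python
import random


def evolve_layer_mask(num_filters, fitness, pop_size=8, generations=20, seed=0):
    """Evolve a binary keep(1)/prune(0) mask for one layer's filters.

    `fitness` scores a candidate mask (higher is better); in practice it
    would reflect validation accuracy and computational savings.
    """
    rng = random.Random(seed)
    # Start from the all-keep mask plus random candidates.
    population = [[1] * num_filters]
    for _ in range(pop_size - 1):
        population.append([rng.choice([0, 1]) for _ in range(num_filters)])
    for _ in range(generations):
        # One-bit-flip mutation of each parent.
        offspring = []
        for mask in population:
            child = mask[:]
            i = rng.randrange(num_filters)
            child[i] = 1 - child[i]
            offspring.append(child)
        # Elitist (mu + lambda)-style survivor selection.
        population = sorted(population + offspring, key=fitness, reverse=True)[:pop_size]
    return max(population, key=fitness)


def ccep_prune(layer_sizes, fitness_for_layer):
    """Divide-and-conquer: evolve each layer's mask in its own small search space."""
    return [evolve_layer_mask(n, fitness_for_layer(i)) for i, n in enumerate(layer_sizes)]
```

Because each layer is searched separately, the search space per EA run is $2^{n_\ell}$ for a layer with $n_\ell$ filters, rather than $2^{\sum_\ell n_\ell}$ for the whole network, which is the key reduction the abstract refers to.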