Title
A Framework for Neural Network Pruning Using Gibbs Distributions
Authors
Abstract
Modern deep neural networks are often too large to use in many practical scenarios. Neural network pruning is an important technique for reducing the size of such models and accelerating inference. Gibbs pruning is a novel framework for expressing and designing neural network pruning methods. Combining approaches from statistical physics and stochastic regularization, it can train and prune a network simultaneously, so that the learned weights and the pruning mask are well adapted to each other. It can be used for structured or unstructured pruning, and we propose a number of specific methods for each. We compare our proposed methods to a number of contemporary neural network pruning methods and find that Gibbs pruning outperforms them. In particular, we achieve a new state-of-the-art result for pruning ResNet-56 on the CIFAR-10 dataset.