Channel Pruning via Multi-Criteria based on Weight Dependency



Channel pruning has demonstrated its effectiveness in compressing ConvNets. In much prior work, the importance of an output feature map is determined solely by its associated filter. However, these methods ignore the small set of weights in the next layer that disappears when the feature map is removed; that is, they ignore the dependency among weights. In addition, many pruning methods use only one evaluation criterion and find a sweet spot between pruned structure and accuracy in a trial-and-error fashion, which can be time-consuming. To address these issues, we propose CPMC, a channel pruning algorithm via multi-criteria based on weight dependency, which can compress a variety of models efficiently. We design the importance of a feature map in three aspects: its associated weight values, computational cost, and parameter quantity. Exploiting the phenomenon of weight dependency, we obtain the importance by assessing both the associated filter and the corresponding partial weights of the next layer. We then apply global normalization to enable cross-layer comparison. Our method can compress various CNN models, including VGGNet, ResNet, and DenseNet, on several image classification datasets. Extensive experiments show that CPMC significantly outperforms prior methods.
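The core idea above, that a channel's importance should count both its own filter and the slice of next-layer weights that vanishes when it is pruned, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the L1 norm, the additive combination, and the sum-based normalization are all assumptions, and the FLOPs/parameter criteria from the multi-criteria design are omitted.

```python
import numpy as np

def channel_importance(w_l, w_next):
    """Score each output channel of layer l, including the dependent
    weights of layer l+1 that disappear if the channel is pruned.

    w_l:    (C_out, C_in, k, k)   filters producing the feature maps
    w_next: (C_next, C_out, k, k) filters consuming those feature maps
    """
    # Norm of each filter in layer l (L1 is an assumed choice here).
    own = np.abs(w_l).reshape(w_l.shape[0], -1).sum(axis=1)
    # Norm of the layer-(l+1) slice tied to each input channel: these
    # weights are the "dependency" that naive filter-only criteria ignore.
    dep = (np.abs(w_next)
           .transpose(1, 0, 2, 3)          # -> (C_out, C_next, k, k)
           .reshape(w_next.shape[1], -1)
           .sum(axis=1))
    return own + dep

def globally_normalized(scores_per_layer):
    """Rescale each layer's scores so they are comparable across layers
    (dividing by the layer sum is an assumed normalization scheme)."""
    return [s / s.sum() for s in scores_per_layer]
```

With such per-layer scores rescaled to a common range, channels from different layers can be ranked in one global list and the lowest-scoring ones pruned, instead of tuning a pruning ratio per layer by trial and error.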

