Filter Pruning using Hierarchical Group Sparse Regularization for Deep Convolutional Neural Networks

Since convolutional neural networks are often trained with redundant parameters, it is possible to remove redundant kernels or filters to obtain a compact network without degrading classification accuracy. In this paper, we propose a filter pruning method using hierarchical group sparse regularization. Our previous work showed that hierarchical group sparse regularization is effective for obtaining sparse networks, in which the filters connected to unnecessary channels are automatically driven close to zero. After training the convolutional neural network with hierarchical group sparse regularization, unnecessary filters are selected based on the increase in classification loss over randomly selected training samples, yielding a compact network. The proposed method can remove more than 50% of the parameters of ResNet for CIFAR-10 with only a 0.3% decrease in test accuracy. Likewise, 34% of the parameters of ResNet are removed for TinyImageNet-200 while achieving higher accuracy than the baseline network.
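The abstract names two concrete mechanisms: a hierarchical group sparse regularizer applied during training, and a filter-selection criterion based on the increase in classification loss on randomly drawn training samples. Below is a minimal PyTorch sketch of both, assuming unweighted L2 group norms at three nested grouping levels and greedy one-filter-at-a-time scoring; the helper names (`hierarchical_group_penalty`, `filter_scores`), the equal weighting of the levels, and the penalty strength `lam` are illustrative assumptions, not the paper's exact formulation.

```python
import torch
import torch.nn as nn


def hierarchical_group_penalty(weight: torch.Tensor) -> torch.Tensor:
    """Hierarchical group sparse penalty for a conv weight of shape
    (out_channels, in_channels, kH, kW): the sum of L2 group norms taken
    at three nested levels. Equal level weighting is an assumption here."""
    filter_level = weight.flatten(1).norm(dim=1).sum()                   # one group per output filter
    channel_level = weight.transpose(0, 1).flatten(1).norm(dim=1).sum()  # one group per input channel
    kernel_level = weight.flatten(2).norm(dim=2).sum()                   # one group per 2-D kernel
    return filter_level + channel_level + kernel_level


def total_loss(model: nn.Module, task_loss: torch.Tensor,
               lam: float = 1e-4) -> torch.Tensor:
    """Task loss plus the hierarchical penalty over every conv layer;
    lam is a hypothetical regularization strength."""
    reg = sum(hierarchical_group_penalty(m.weight)
              for m in model.modules() if isinstance(m, nn.Conv2d))
    return task_loss + lam * reg


@torch.no_grad()
def filter_scores(model: nn.Module, layer: nn.Conv2d,
                  inputs: torch.Tensor, targets: torch.Tensor,
                  criterion: nn.Module) -> list:
    """Score each filter in `layer` by how much the classification loss on
    one randomly drawn training batch increases when that filter is zeroed.
    Bias and BatchNorm effects of a removed filter are ignored in this sketch."""
    base = criterion(model(inputs), targets).item()
    scores = []
    for i in range(layer.out_channels):
        saved = layer.weight[i].clone()
        layer.weight[i].zero_()                                   # temporarily disable filter i
        scores.append(criterion(model(inputs), targets).item() - base)
        layer.weight[i].copy_(saved)                              # restore it
    return scores  # filters with the smallest loss increase are pruned first
```

In this reading, training minimizes `total_loss`, which drives whole filter and channel groups toward zero, and pruning then removes the filters whose `filter_scores` entries are smallest, i.e. those whose removal barely increases the loss on the sampled batch.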
