
ISBNet: Instance-aware Selective Branching Networks

Recent years have witnessed growing interest in designing efficient neural networks and in neural architecture search (NAS). Although remarkable efficiency and accuracy have been achieved, existing expert-designed and NAS models neglect the fact that input instances are of varying complexity and thus require different amounts of computation. Inference with a fixed model that processes all instances through the same transformations wastes computational resources unnecessarily. Customizing model capacity in an instance-aware manner is required to alleviate this problem. In this paper, we propose a novel Instance-aware Selective Branching Network (ISBNet) to support efficient instance-level inference by selectively bypassing transformation branches of insignificant importance weight. These weights are dynamically determined by a lightweight hypernetwork, SelectionNet, and recalibrated by Gumbel-Softmax for sparse branch selection. Extensive experiments show that ISBNet achieves extremely efficient inference in terms of parameter size and FLOPs compared to existing networks. For example, ISBNet requires only 8.70% of the parameters and 31.01% of the FLOPs of the efficient network MobileNetV2, with comparable accuracy on CIFAR-10.
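The core mechanism described above, sampling near-one-hot branch weights with Gumbel-Softmax and bypassing branches whose weight is insignificant, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the function names (`gumbel_softmax`, `selective_forward`), the fixed logits standing in for SelectionNet's per-instance output, and the `threshold` cutoff for "insignificant" branches are all illustrative assumptions.

```python
import numpy as np

def gumbel_softmax(logits, tau=0.5, rng=None):
    """Sample a relaxed (near-one-hot) categorical over branches.

    Lower temperature tau pushes the sample toward a hard selection,
    which yields the sparse branch weights described in the abstract.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise: g = -log(-log(u)), u ~ Uniform(0, 1)
    g = -np.log(-np.log(rng.uniform(size=logits.shape)))
    z = (logits + g) / tau
    z -= z.max()                      # for numerical stability
    y = np.exp(z)
    return y / y.sum()

def selective_forward(x, branches, logits, threshold=0.1, rng=None):
    """Hypothetical instance-aware forward pass.

    In ISBNet the logits would come from the SelectionNet hypernetwork for
    each input instance; here they are passed in directly. Branches whose
    recalibrated weight falls below `threshold` are bypassed entirely,
    saving their computation.
    """
    weights = gumbel_softmax(np.asarray(logits, dtype=float), rng=rng)
    out = np.zeros_like(x, dtype=float)
    for w, branch in zip(weights, branches):
        if w > threshold:             # skip insignificant branches
            out += w * branch(x)      # weighted sum of surviving branches
    return out, weights

# Toy example: three "transformation branches" applied to a vector input.
branches = [lambda v: v, lambda v: 2.0 * v, lambda v: v ** 2]
x = np.array([1.0, 2.0])
out, w = selective_forward(x, branches, logits=[2.0, 0.1, -1.0],
                           rng=np.random.default_rng(0))
```

At a low temperature the sampled weights concentrate on one or two branches, so most branches contribute nothing and their computation can be skipped, which is the source of the parameter and FLOP savings the abstract reports.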
