Composability-Centered Convolutional Neural Network Pruning [report]

Hui Guan, Xipeng Shen, Seung-Hwan Lim, Robert M. Patton
2018 unpublished
This work studies the composability of the building blocks of structural CNN models (e.g., GoogLeNet and Residual Networks) in the context of network pruning. We empirically validate that a network composed of pre-trained building blocks (e.g., residual blocks and Inception modules) not only gives a better starting point for training, but also allows the training process to converge at a significantly higher accuracy in much less time. Based on that insight, we propose a new design for CNN network pruning. Experiments show that this new scheme shortens the configuration process in CNN network pruning by up to 186.8X for ResNet-50 and up to 30.2X for Inception-V3, and the models it finds that meet the accuracy requirement are significantly more compact than those found by default schemes.
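The key idea can be illustrated with a toy sketch (not the paper's implementation; block names, pruning rates, and costs are hypothetical): instead of training every candidate pruned network from scratch, each distinct pruned block is pre-trained once and reused when assembling candidates, so the number of expensive block-level trainings drops from (candidates × blocks) to the number of distinct (block, rate) pairs.

```python
from itertools import product

# Hypothetical setting: 3 blocks, each prunable at 3 rates -> 27 candidate networks.
blocks = ["block1", "block2", "block3"]
rates = [0.3, 0.5, 0.7]

train_calls = 0
_cache = {}

def pretrain_block(block, rate):
    """Pre-train a pruned block once; reuse its weights across all candidates."""
    global train_calls
    key = (block, rate)
    if key not in _cache:
        train_calls += 1                       # expensive training happens only here
        _cache[key] = f"weights({block}@{rate})"
    return _cache[key]

# Default scheme: every candidate trains all of its blocks from scratch.
candidates = list(product(rates, repeat=len(blocks)))   # 27 candidates
default_cost = len(candidates) * len(blocks)            # 81 block trainings

# Composability-centered scheme: assemble candidates from cached pre-trained blocks.
for cand in candidates:
    net = [pretrain_block(b, r) for b, r in zip(blocks, cand)]

print(default_cost, train_calls)  # 81 vs 9 block-level trainings
```

In this toy count, block reuse cuts 81 trainings down to 9; the paper's reported speedups (up to 186.8X) come from the same kind of reuse applied to real pruning configuration spaces.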
doi:10.2172/1427608