A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2019. The file type is application/pdf.
Structured Pruning for Efficient ConvNets via Incremental Regularization
2019 International Joint Conference on Neural Networks (IJCNN)
Parameter pruning is a promising approach to CNN compression and acceleration: it eliminates redundant model parameters with tolerable performance loss. Despite its effectiveness, existing regularization-based parameter pruning methods usually drive weights towards zero with large, constant regularization factors, neglecting the fact that the expressiveness of CNNs is fragile and calls for a gentler regularization scheme that lets the network adapt during pruning. To solve this problem, the paper proposes structured pruning via incremental regularization, in which the regularization factors are increased gradually rather than fixed at a large constant value.
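The abstract describes the core idea only at a high level. Below is a minimal NumPy sketch of what an incremental regularization scheme of this kind could look like; the L1-norm importance proxy, the median threshold, and all function and parameter names (incremental_reg_step, reg_delta, reg_max) are illustrative assumptions, not the paper's exact IncReg algorithm.

import numpy as np

def incremental_reg_step(filters, reg, reg_delta, reg_max, lr=1e-2):
    """One sketch step of incremental-regularization pruning.

    filters:   (n_filters, fan_in) matrix, one row per flattened conv filter
    reg:       (n_filters,) current per-filter regularization factors
    reg_delta: small increment added to the factors of unimportant filters
    reg_max:   factor at which a filter is considered pruned
    """
    # Rank filters by a simple importance proxy (L1 norm; illustrative only).
    importance = np.abs(filters).sum(axis=1)
    threshold = np.median(importance)
    # Gently raise the penalty on below-median filters instead of hitting
    # them with one large, constant factor all at once.
    reg = reg + reg_delta * (importance < threshold)
    # Per-filter weight decay: w <- w - lr * reg * w
    filters = filters * (1.0 - lr * reg[:, None])
    # A filter whose factor has reached reg_max is zeroed out (pruned).
    filters[reg >= reg_max] = 0.0
    return filters, reg

# Toy usage: 8 filters with fan-in 27 (e.g., 3x3x3 kernels flattened).
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 27))
reg = np.zeros(8)
for _ in range(200):
    W, reg = incremental_reg_step(W, reg, reg_delta=1e-3, reg_max=0.05)
print("pruned filters:", np.where(np.all(W == 0.0, axis=1))[0].tolist())

The contrast with constant-factor regularization is that the penalty on each filter grows in small steps, giving the surviving filters time to compensate before anything is zeroed out.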
doi:10.1109/ijcnn.2019.8852463
dblp:conf/ijcnn/WangZWYH19
fatcat:pusjdcff5nertcxcq2b3jl2sea