Incremental Layers Resection: A Novel Method to Compress Neural Networks

Xiang Liu, Li-Na Wang, Wenxue Liu, Guoqiang Zhong, Junyu Dong
IEEE Access, 2019
In recent years, deep neural networks (DNNs) have been widely applied in many areas, such as computer vision and pattern recognition. However, we observe that most DNNs include redundant layers. Hence, in this paper, we introduce a novel method named incremental layers resection (ILR) to resect the redundant layers in DNNs while preserving their learning performance. ILR uses a multistage learning strategy to incrementally resect the inconsequential layers. In each stage, it preserves the data representations learned by the original network while connecting the two layers adjacent to each resected one. In particular, based on a teacher-student knowledge transfer framework, we design layer-level and overall learning procedures to enforce that the resected network performs similarly to the original one. Extensive experiments demonstrate that, compared to the original networks, the networks compressed by ILR require only about half the storage space and offer higher inference speed. More importantly, they even deliver higher classification accuracy than the original networks.
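The paper's exact training procedure is not reproduced here, but the following PyTorch sketch illustrates the general idea of one resection stage under a teacher-student framework, assuming a plain feed-forward network whose layers adjacent to the cut have matching feature sizes. All names and hyperparameters (resect_layer, distillation_losses, tau) are illustrative placeholders, not taken from the paper.

```python
# Minimal sketch of one ILR-style resection stage (hypothetical names).
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

def resect_layer(teacher: nn.Sequential, idx: int) -> nn.Sequential:
    """Build a student by dropping layer `idx` from the teacher,
    directly connecting its two neighboring layers (assumes the
    surrounding feature sizes match)."""
    layers = [copy.deepcopy(l) for i, l in enumerate(teacher) if i != idx]
    return nn.Sequential(*layers)

def distillation_losses(teacher, student, x, y, match_idx, tau=4.0):
    """Layer-level loss: match the hidden representation the teacher
    produced just after the resected layer. Overall loss: soft-label
    KL divergence plus the usual cross-entropy on the ground truth."""
    with torch.no_grad():
        t_hidden = teacher[:match_idx + 1](x)  # teacher's representation
        t_logits = teacher(x)
    s_hidden = student[:match_idx](x)          # student has one layer fewer
    s_logits = student(x)
    layer_loss = F.mse_loss(s_hidden, t_hidden)
    kd_loss = F.kl_div(F.log_softmax(s_logits / tau, dim=1),
                       F.softmax(t_logits / tau, dim=1),
                       reduction="batchmean") * tau * tau
    ce_loss = F.cross_entropy(s_logits, y)
    return layer_loss, kd_loss + ce_loss

# Toy usage: resect the redundant 256->256 layer of a small MLP.
teacher = nn.Sequential(nn.Linear(784, 256), nn.ReLU(),
                        nn.Linear(256, 256), nn.ReLU(),
                        nn.Linear(256, 10))
student = resect_layer(teacher, idx=2)
x, y = torch.randn(32, 784), torch.randint(0, 10, (32,))
layer_loss, overall_loss = distillation_losses(teacher, student, x, y, match_idx=2)
(layer_loss + overall_loss).backward()
```

In a full multistage pipeline, the student produced by one stage would serve as the teacher for the next, resecting one inconsequential layer per stage, which matches the incremental strategy the abstract describes.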
doi:10.1109/access.2019.2952615