Incremental Layers Resection: A Novel Method to Compress Neural Networks
2019
IEEE Access
In recent years, deep neural networks (DNNs) have been widely applied in many areas, such as computer vision and pattern recognition. However, we observe that most DNNs include redundant layers. Hence, in this paper, we introduce a novel method named incremental layers resection (ILR) to resect the redundant layers in DNNs while preserving their learning performance. ILR uses a multistage learning strategy to incrementally resect the inconsequential layers. In each stage, it preserves …
doi:10.1109/access.2019.2952615
fatcat:ipm6yjvhkzfvzcmmyavm2wxgoq
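The abstract describes ILR only at a high level, and the capture truncates it before the per-stage details. As a rough illustration of the general idea, the sketch below greedily resects the hidden block whose removal costs the least held-out accuracy, stage by stage; the PyTorch model, the greedy accuracy proxy, and the stopping tolerance are all assumptions for illustration, not the authors' procedure.

```python
# A minimal sketch of the incremental-resection idea from the abstract,
# NOT the paper's ILR algorithm (the full abstract and training details
# are truncated in this capture). Each "stage" removes the hidden block
# whose deletion hurts a held-out score least; the greedy proxy, network
# shape, and tolerance are all assumptions.
import torch
import torch.nn as nn

def resect_one_block(model: nn.Sequential, evaluate) -> nn.Sequential:
    """Drop the single hidden block whose removal best preserves the score."""
    best, best_score = None, float("-inf")
    for i in range(1, len(model) - 1):  # keep input and output blocks
        candidate = nn.Sequential(*(m for j, m in enumerate(model) if j != i))
        score = evaluate(candidate)
        if score > best_score:
            best, best_score = candidate, score
    return best

# Toy data and a network whose hidden blocks all share one width, so any
# of them can be resected without breaking shape compatibility.
torch.manual_seed(0)
x, y = torch.randn(256, 16), torch.randint(0, 4, (256,))
hidden = lambda: nn.Sequential(nn.Linear(32, 32), nn.ReLU())
model = nn.Sequential(
    nn.Sequential(nn.Linear(16, 32), nn.ReLU()),  # input block
    hidden(), hidden(), hidden(),                 # resectable layers
    nn.Linear(32, 4),                             # output block
)

def accuracy(m: nn.Module) -> float:
    with torch.no_grad():
        return (m(x).argmax(dim=1) == y).float().mean().item()

# Multistage loop: keep resecting while the score stays within tolerance.
# The paper fine-tunes the network in each stage; that step is omitted here.
while len(model) > 3:
    smaller = resect_one_block(model, accuracy)
    if accuracy(smaller) < accuracy(model) - 0.02:
        break
    model = smaller
print(f"{len(model)} blocks remain, accuracy={accuracy(model):.3f}")
```

Keeping all hidden blocks at one width is purely a convenience here: it lets any block be deleted without breaking layer-shape compatibility, which a faithful implementation would instead have to handle when adjacent layers differ in size.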