A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Fixed-Point Back-Propagation Training
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
The recently emerged quantization technique (i.e., using low bit-width fixed-point data instead of high bit-width floating-point data) has been applied to the inference of deep neural networks for fast and efficient execution. However, directly applying quantization to training can cause significant accuracy loss and thus remains an open challenge. In this paper, we propose a novel training approach that applies layer-wise precision-adaptive quantization in deep neural networks. The new training
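To make the abstract's core idea concrete: fixed-point quantization represents each value with a fixed total bit width split between integer and fractional bits, and a layer-wise precision-adaptive scheme lets different layers use different widths. The following Python sketch is an illustration of that general primitive only, not the paper's actual method; the function name and parameter choices are hypothetical.

    import numpy as np

    def quantize_fixed_point(x, bit_width=8, frac_bits=4):
        """Round x to signed fixed-point with `bit_width` total bits,
        `frac_bits` of which represent the fractional part."""
        scale = 2.0 ** frac_bits
        qmin = -(2 ** (bit_width - 1))      # most negative representable code
        qmax = 2 ** (bit_width - 1) - 1     # most positive representable code
        q = np.clip(np.round(x * scale), qmin, qmax)
        return q / scale                    # dequantized value used in compute

    # Example: a layer quantized to 8 bits keeps coarser values than one at 12 bits.
    w = np.array([0.7312, -1.284, 0.0461])
    print(quantize_fixed_point(w, bit_width=8, frac_bits=4))   # [ 0.75  -1.3125  0.0625]
    print(quantize_fixed_point(w, bit_width=12, frac_bits=8))  # finer-grained values

A precision-adaptive scheme in the abstract's sense would choose bit_width per layer (and potentially per training phase) rather than fixing it globally, trading quantization error against compute and memory cost.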
doi:10.1109/cvpr42600.2020.00240
dblp:conf/cvpr/ZhangLZLHZGGDZC20
fatcat:lbgxpo7xlfgctnxt5m5y3oajim