A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2022; you can also visit the original URL.
The file type is application/pdf.
Towards Unified INT8 Training for Convolutional Neural Network
2020
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Recently, low-bit (e.g., 8-bit) network quantization has been extensively studied to accelerate inference. Beyond inference, low-bit training with quantized gradients can bring even greater acceleration, since the backward pass is often computation-intensive. Unfortunately, inappropriate quantization of backward propagation usually makes training unstable or even causes it to crash. There is still no successful unified low-bit training framework that can support diverse networks on
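As a rough illustration of the 8-bit quantization the abstract refers to, the sketch below implements a generic symmetric per-tensor INT8 quantizer in NumPy. This is a common baseline scheme, not necessarily the one proposed in the paper; the function names and the clipping range [-127, 127] are assumptions for illustration.

```python
import numpy as np

def int8_quantize(x):
    # Symmetric per-tensor INT8 quantization (generic sketch,
    # not the paper's exact scheme): map the max absolute value
    # of the tensor onto the INT8 range [-127, 127].
    max_abs = np.max(np.abs(x))
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

def int8_dequantize(q, scale):
    # Recover an approximation of the original float tensor.
    return q.astype(np.float32) * scale

x = np.array([-1.5, 0.0, 0.3, 2.54], dtype=np.float32)
q, s = int8_quantize(x)
x_hat = int8_dequantize(q, s)
```

Applying the same kind of quantizer to gradients in the backward pass is what the abstract warns can destabilize training, since gradient distributions are far less well behaved than weights or activations.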
doi:10.1109/cvpr42600.2020.00204
dblp:conf/cvpr/ZhuGYLWLYY20
fatcat:7ujbnvuumrbp5ogz7vgxrxukl4