A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2019.
The file type is application/pdf.
In-place Activated BatchNorm for Memory-Optimized Training of DNNs
2018
2018 IEEE/CVF Conference on Computer Vision and Pattern Recognition
In this work we present In-Place Activated Batch Normalization (INPLACE-ABN), a novel approach to drastically reduce the training memory footprint of modern deep neural networks in a computationally efficient way. Our solution substitutes the conventionally used succession of BatchNorm + Activation layers with a single plug-in layer, hence avoiding invasive framework surgery while providing straightforward applicability for existing deep learning frameworks. We obtain memory savings of up to …
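The core idea described in the abstract — fusing BatchNorm and the following activation into one layer whose output overwrites its input buffer — can be sketched in a few lines. This is an illustrative NumPy sketch, not the authors' CUDA implementation: the function name and signature are hypothetical, and the real INPLACE-ABN additionally recovers the BN input during the backward pass by inverting the (invertible) activation, so no extra buffer is ever stored.

```python
import numpy as np

def inplace_bn_act(x, gamma, beta, eps=1e-5, slope=0.01):
    """Fused BatchNorm + LeakyReLU, computed in place on x.

    Hypothetical sketch: normalization, affine transform, and the
    invertible leaky-ReLU activation all overwrite x, so only one
    buffer is kept instead of separate BN-output and activation-output
    tensors.
    """
    mean = x.mean(axis=0)
    var = x.var(axis=0)
    # normalize, then scale and shift, reusing x's storage throughout
    x -= mean
    x /= np.sqrt(var + eps)
    x *= gamma
    x += beta
    # leaky ReLU applied in place; invertible because slope > 0
    x[x < 0] *= slope
    return x
```

Because the leaky ReLU is strictly monotone, the pre-activation values can be reconstructed from the stored output during backpropagation, which is what lets the fused layer discard its intermediate results.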
doi:10.1109/cvpr.2018.00591
dblp:conf/cvpr/BuloPK18
fatcat:inqltbivvjcurecpbc7fca24cy