MemNAS: Memory-Efficient Neural Architecture Search With Grow-Trim Learning
2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Recent studies on automatic neural architecture search techniques have demonstrated significant performance, competitive with or even better than hand-crafted neural architectures. However, most existing search approaches tend to use residual structures and concatenation connections between shallow and deep features. The resulting neural network model is therefore non-trivial for resource-constrained devices to execute, since such a model requires large memory to store network parameters and …
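As a rough illustration of the memory issue the abstract describes, here is a minimal PyTorch sketch, not the authors' MemNAS code: the block structure, channel counts, and tensor sizes are assumptions chosen for illustration. It shows why a concatenation connection between shallow and deep features forces multiple feature maps to stay resident in memory at the same time.

```python
# Minimal sketch (illustrative, not from the paper): a DenseNet-style block
# whose concatenation keeps the shallow feature map alive until the deep one
# is computed, inflating peak activation memory at inference time.
import torch
import torch.nn as nn


class ConcatBlock(nn.Module):
    """output = fuse(cat(shallow, deep)) -- both inputs live simultaneously."""

    def __init__(self, ch: int):
        super().__init__()
        self.conv1 = nn.Conv2d(ch, ch, 3, padding=1)
        self.conv2 = nn.Conv2d(ch, ch, 3, padding=1)
        self.fuse = nn.Conv2d(2 * ch, ch, 1)  # consumes both tensors at once

    def forward(self, x):
        shallow = self.conv1(x)      # must stay resident ...
        deep = self.conv2(shallow)   # ... while the deeper layer runs
        # shallow, deep, and the concat buffer coexist here, unlike a plain
        # sequential chain where each activation can be freed after use.
        return self.fuse(torch.cat([shallow, deep], dim=1))


x = torch.randn(1, 32, 56, 56)  # assumed input size, for illustration only
with torch.no_grad():
    y = ConcatBlock(32)(x)
print(y.shape)  # torch.Size([1, 32, 56, 56])
```

Under these assumptions, peak activation memory at the concatenation is roughly the shallow map plus the deep map plus the concatenated buffer, which is the kind of footprint a memory-constrained device cannot easily accommodate.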
doi:10.1109/cvpr42600.2020.00218
dblp:conf/cvpr/Liu0MS20
fatcat:swcu66ybb5cqzfy3vsr4pkdnqa