Survey of Dropout Methods for Deep Neural Networks [article]

Alex Labach, Hojjat Salehinejad, Shahrokh Valaee
2019 arXiv pre-print
Dropout methods are a family of stochastic techniques used in neural network training or inference that have generated significant research interest and are widely used in practice. They have been successfully applied in neural network regularization, model compression, and in measuring the uncertainty of neural network outputs. While originally formulated for dense neural network layers, recent advances have made dropout methods applicable to convolutional and recurrent neural network layers as well. This paper summarizes the history of dropout methods, their various applications, and current areas of research interest. Important proposed methods are described in additional detail.
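The abstract's core technique, standard dropout on a dense layer, can be illustrated with a minimal sketch. This is the common "inverted dropout" formulation (units are zeroed with probability p during training and survivors are rescaled so inference needs no correction); it is a generic illustration, not code from the surveyed paper, and the function name and NumPy usage are this sketch's own choices.

```python
import numpy as np

def dropout(x, p=0.5, rng=None):
    """Inverted dropout for training time.

    Each unit is zeroed independently with probability p, and the
    surviving units are scaled by 1/(1-p) so that the expected
    activation matches the no-dropout case. At inference time the
    layer is simply the identity.
    """
    rng = np.random.default_rng(rng)
    mask = rng.random(x.shape) >= p  # keep each unit with probability 1 - p
    return x * mask / (1.0 - p)
```

For example, applying `dropout` with `p=0.5` to a vector of ones yields entries that are either 0 or 2, with mean close to 1, reflecting the rescaling that keeps the expected activation unchanged.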
arXiv:1904.13310v2