A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2019; you can also visit the original URL.
Activation Functions: Comparison of trends in Practice and Research for Deep Learning
[article]
2018
arXiv
pre-print
Deep neural networks have been successfully used in diverse emerging domains to solve real-world complex problems, with many more deep learning (DL) architectures being developed to date. To achieve these state-of-the-art performances, DL architectures use activation functions (AFs) to perform diverse computations between the hidden layers and the output layers of any given DL architecture. This paper presents a survey of the existing AFs used in deep learning applications and highlights the ...
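The computations the abstract refers to are elementwise non-linear transforms applied to a layer's pre-activations. As a minimal illustrative sketch (the specific functions below, sigmoid, tanh, and ReLU, are standard examples and not necessarily the full set the survey covers):

```python
import numpy as np

# Standard activation functions, applied elementwise to a layer's outputs.

def sigmoid(x):
    # Squashes input into (0, 1); historically common in output layers.
    return 1.0 / (1.0 + np.exp(-x))

def tanh(x):
    # Squashes input into (-1, 1); a zero-centered alternative to sigmoid.
    return np.tanh(x)

def relu(x):
    # Rectified Linear Unit, max(0, x); widely used in hidden layers.
    return np.maximum(0.0, x)

# Example: applying an AF to the pre-activations of a hidden layer.
z = np.array([-2.0, 0.0, 2.0])
print(relu(z))       # [0. 0. 2.]
print(sigmoid(0.0))  # 0.5
```

Each function trades off range, gradient behavior, and computational cost, which is the kind of comparison such a survey draws between them.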
arXiv:1811.03378v1
fatcat:qfwc3ywnarhi7gdnzhbcks2mom