A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020. The file type is application/pdf.
Why ADAGRAD Fails for Online Topic Modeling
2017
Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing
Online topic modeling, i.e., topic modeling with stochastic variational inference, is a powerful and efficient technique for analyzing large datasets, and ADAGRAD is a widely-used technique for tuning learning rates during online gradient optimization. However, these two techniques do not work well together. We show that this is because ADAGRAD uses accumulation of previous gradients as the learning rates' denominators. For online topic modeling, the magnitude of gradients is very large. It …
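Since the abstract's argument turns on ADAGRAD's denominator, a small numerical sketch may help. ADAGRAD's standard per-coordinate step size is eta_t = eta0 / sqrt(sum of squared past gradients) (Duchi et al., 2011). The snippet below is illustrative only: the helper name `adagrad_rates` and the gradient magnitudes are assumptions, not taken from the paper, but they show how the very large gradients the abstract describes collapse the step size almost immediately.

```python
import numpy as np

def adagrad_rates(grads, eta0=1.0, eps=1e-8):
    """Per-step ADAGRAD learning rates: eta0 / sqrt(accumulated squared gradients)."""
    accum = 0.0
    rates = []
    for g in grads:
        accum += g ** 2  # gradient history piles up in the denominator and never decays
        rates.append(eta0 / (np.sqrt(accum) + eps))
    return rates

# Modest gradients vs. very large gradients (illustrative magnitudes;
# the paper's point is that online topic modeling produces the latter).
small = adagrad_rates(np.full(100, 0.1))
large = adagrad_rates(np.full(100, 1000.0))
print(small[0], small[-1])  # decays gently: ~10.0 down to ~1.0 after 100 steps
print(large[0], large[-1])  # collapses at once: ~1e-3 down to ~1e-4
```

Because the accumulated denominator only grows, a single regime of large gradients permanently pins the learning rate near zero, which is the incompatibility the abstract points to.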
doi:10.18653/v1/d17-1046
dblp:conf/emnlp/LuLB17
fatcat:yr4tgeqg7rbphb7smyqjwzqebe