Convergence rate of stochastic k-means
[article]
2016
arXiv pre-print
We analyze online and mini-batch k-means variants. Both scale up the widely used Lloyd's algorithm via stochastic approximation, and have become popular for large-scale clustering and unsupervised feature learning. We show, for the first time, that they have global convergence towards local optima at O(1/t) rate under general conditions. In addition, we show that if the dataset is clusterable, then with suitable initialization, mini-batch k-means converges to an optimal k-means solution with O(1/t) convergence rate.
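For intuition, here is a minimal sketch of a mini-batch k-means update in the stochastic-approximation style the abstract describes. The function name, step-size schedule, and initialization below are illustrative assumptions, not the paper's exact algorithm or analysis.

```python
import numpy as np

def minibatch_kmeans(X, k, batch_size=100, n_iters=1000, seed=0):
    """Sketch of mini-batch k-means with per-center decaying step sizes
    (an assumption for illustration, not the authors' exact variant)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    # Initialize centers by sampling k distinct data points.
    centers = X[rng.choice(n, size=k, replace=False)].copy()
    counts = np.zeros(k)  # points assigned to each center so far

    for _ in range(n_iters):
        batch = X[rng.choice(n, size=batch_size, replace=False)]
        # Assign each mini-batch point to its nearest center.
        dists = ((batch[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            pts = batch[labels == j]
            if len(pts) == 0:
                continue
            counts[j] += len(pts)
            # Step size shrinks roughly like 1/t, as in stochastic approximation.
            eta = len(pts) / counts[j]
            centers[j] += eta * (pts.mean(axis=0) - centers[j])
    return centers
```

The decaying per-center step size is what turns each update into a stochastic approximation of Lloyd's batch recentering step; the paper's O(1/t) rates concern schedules of this general kind.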
arXiv:1610.04900v2
fatcat:5zpfr3qxbjg4hebnbmgzud3y4e