Properly-weighted graph Laplacian for semi-supervised learning [article]

Jeff Calder, Dejan Slepcev
2019, arXiv preprint
The performance of traditional graph Laplacian methods for semi-supervised learning degrades substantially as the ratio of labeled to unlabeled data decreases, due to a degeneracy in the graph Laplacian. Several approaches have been proposed recently to address this; however, we show that some of them remain ill-posed in the large-data limit. In this paper, we show a way to correctly set the weights in Laplacian regularization so that the estimator remains well posed and stable in the large sample size limit. We prove that our semi-supervised learning algorithm converges, in the infinite sample size limit, to the smooth solution of a continuum variational problem that attains the labeled values continuously. Our method is fast and easy to implement.
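For context, the baseline the abstract refers to is standard graph Laplacian (harmonic) semi-supervised learning: labeled values are extended to unlabeled nodes by minimizing the Dirichlet energy u^T L u subject to the label constraints. The sketch below is a minimal illustration of that baseline on synthetic data (Gaussian edge weights, a dense small graph), not the authors' properly-weighted scheme; all variable names and parameter choices (eps, the labeled indices) are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of baseline graph-Laplacian semi-supervised learning
# (harmonic extension), NOT the properly-weighted method of the paper.
rng = np.random.default_rng(0)
n = 40
X = rng.uniform(size=(n, 2))  # sample points in the unit square

# Gaussian edge weights w_ij = exp(-|x_i - x_j|^2 / eps^2)
eps = 0.3  # illustrative bandwidth choice
d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
W = np.exp(-d2 / eps**2)
np.fill_diagonal(W, 0.0)

# Unnormalized graph Laplacian L = D - W
L = np.diag(W.sum(axis=1)) - W

# Two labeled nodes; all other nodes are unlabeled
labeled = np.array([0, 1])
u_l = np.array([0.0, 1.0])  # labeled values
unlabeled = np.setdiff1d(np.arange(n), labeled)

# Minimize u^T L u subject to the labels: solve L_uu u_u = -L_ul u_l
A = L[np.ix_(unlabeled, unlabeled)]
b = -L[np.ix_(unlabeled, labeled)] @ u_l
u = np.empty(n)
u[labeled] = u_l
u[unlabeled] = np.linalg.solve(A, b)

# By the discrete maximum principle the solution stays within [0, 1];
# with very few labels it also flattens out away from them, which is
# the degeneracy the paper's reweighting is designed to fix.
print(u.min(), u.max())
```

With only two labels among many unlabeled points, the solution computed here tends toward a near-constant function with sharp spikes at the labeled nodes as n grows, which is the degeneracy described above.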
arXiv:1810.04351v2