Fast rates for support vector machines using Gaussian kernels

Ingo Steinwart, Clint Scovel
2007 Annals of Statistics  
For binary classification we establish learning rates up to the order of $n^{-1}$ for support vector machines (SVMs) with hinge loss and Gaussian RBF kernels. These rates are given in terms of two assumptions on the considered distributions: Tsybakov's noise assumption, which yields a small estimation error, and a new geometric noise condition, which is used to bound the approximation error. Unlike previously proposed concepts for bounding the approximation error, the geometric noise assumption does not employ any smoothness assumption.
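The two building blocks named in the abstract, the hinge loss and the Gaussian RBF kernel, can be written down directly. The sketch below is illustrative only (the width parameter `sigma` and the plain-Python style are our choices, not taken from the paper):

```python
import math

def gaussian_rbf(x, x_prime, sigma):
    """Gaussian RBF kernel k(x, x') = exp(-sigma^2 * ||x - x'||^2).

    `sigma` here is the kernel width parameter; the paper's rates are
    obtained by letting such a parameter vary with the sample size n.
    """
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, x_prime))
    return math.exp(-(sigma ** 2) * sq_dist)

def hinge_loss(y, f_x):
    """Hinge loss L(y, f(x)) = max(0, 1 - y * f(x)) for labels y in {-1, +1}."""
    return max(0.0, 1.0 - y * f_x)

# A point compared with itself has kernel value 1, regardless of sigma.
print(gaussian_rbf([1.0, 2.0], [1.0, 2.0], sigma=0.5))  # 1.0
# A correctly classified point with margin >= 1 incurs zero hinge loss.
print(hinge_loss(1, 2.0))  # 0.0
# A point on the decision boundary (f(x) = 0) incurs loss 1.
print(hinge_loss(-1, 0.0))  # 1.0
```

The SVM decision function is then a kernel expansion $f(x) = \sum_i \alpha_i y_i k(x_i, x) + b$, and the learning rates in the paper concern how fast the excess hinge risk of such estimators decays in $n$ under the two stated noise conditions.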
doi:10.1214/009053606000001226