Learning rates for the kernel regularized regression with a differentiable strongly convex loss

Baohuai Sheng (Department of Applied Statistics, Shaoxing University, Shaoxing 312000, China), Huanxiang Liu, Huimin Wang
Communications on Pure and Applied Analysis, 2020
We consider learning rates of kernel regularized regression (KRR) based on reproducing kernel Hilbert spaces (RKHSs) and differentiable strongly convex losses, and provide some new strongly convex losses. We first show robustness with respect to the maximum mean discrepancy (MMD) and the Hutchinson metric, respectively, and, along this line, bound the learning rate of the KRR. We then provide a capacity-dependent learning rate, followed by the learning rates for four concrete strongly convex losses. In particular, we provide the learning rates when the hypothesis RKHS's logarithmic complexity exponent is arbitrarily small as well as sufficiently large.

2010 Mathematics Subject Classification. Primary: 90C25, 68Q32, 68T40; Secondary: 41A25.
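For context, when the strongly convex loss is the squared loss, kernel regularized regression reduces to classical kernel ridge regression, whose RKHS minimizer has a closed form via the representer theorem. The following is a minimal sketch of that special case; the Gaussian kernel, the bandwidth `gamma`, and the regularization parameter `lam` are illustrative assumptions, not choices made in the paper.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - z_j||^2)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def krr_fit(X, y, lam=0.1, gamma=1.0):
    """Solve the regularized least-squares problem in the RKHS.

    By the representer theorem, the minimizer of
        (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2
    is f = sum_i alpha_i K(x_i, .), with
        alpha = (K + n * lam * I)^{-1} y.
    """
    n = len(X)
    K = gaussian_kernel(X, X, gamma)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def krr_predict(alpha, X_train, X_test, gamma=1.0):
    """Evaluate the fitted function at the test points."""
    return gaussian_kernel(X_test, X_train, gamma) @ alpha
```

For a non-quadratic strongly convex loss there is generally no closed form, and the coefficient vector would instead be found by a convex solver; strong convexity guarantees the minimizer is unique.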
doi:10.3934/cpaa.2020176