We investigate statistical properties for a broad class of modern kernel-based regression (KBR) methods. These kernel methods were developed during the last decade and are inspired by convex risk minimization in infinite-dimensional Hilbert spaces. One leading example is support vector regression. We first describe the relationship between the loss function $L$ of the KBR method and the tail of the response variable. We then establish the $L$-risk consistency for KBR, which gives the …

doi:10.3150/07-bej5102
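The paper treats general convex loss functions $L$; as an illustration, a minimal NumPy sketch under the simplifying assumptions of a least-squares loss and a Gaussian kernel (i.e., kernel ridge regression, not the paper's general setting). By the representer theorem, the minimizer of the regularized empirical risk (1/n) Σ L(y_i, f(x_i)) + λ‖f‖²_H then has a closed form:

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, x') = exp(-gamma * ||x - x'||^2).
    d2 = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * d2)

def fit_kbr(X, y, lam=1e-4, gamma=1.0):
    # KBR with least-squares loss: minimize (1/n) sum (y_i - f(x_i))^2 + lam * ||f||_H^2.
    # The representer theorem gives f(x) = sum_i alpha_i k(x, x_i) with
    # alpha = (K + n * lam * I)^{-1} y.
    K = gaussian_kernel(X, X, gamma)
    n = len(y)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return lambda Xnew: gaussian_kernel(Xnew, X, gamma) @ alpha

# Hypothetical usage: recover a noisy sine curve.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
f = fit_kbr(X, y, lam=1e-4, gamma=1.0)
Xtest = np.linspace(-3, 3, 50)[:, None]
err = np.max(np.abs(f(Xtest) - np.sin(Xtest[:, 0])))
```

Replacing the squared loss with the epsilon-insensitive loss yields support vector regression, the paper's leading example; the closed form above is specific to the least-squares case.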