Generalized Core Vector Machines
2006
IEEE Transactions on Neural Networks
Kernel methods, such as the support vector machine (SVM), are often formulated as quadratic programming (QP) problems. However, given m training patterns, a naive implementation of the QP solver takes O(m³) training time and at least O(m²) space. Hence, scaling up these QPs is a major stumbling block in applying kernel methods to very large data sets, and a replacement of the naive method for finding the QP solutions is highly desirable. Recently, by using approximation algorithms for the minimum enclosing ball (MEB) problem [...]
doi:10.1109/tnn.2006.878123
pmid:17001975
fatcat:syoc3vvwb5h2bmho7rxzrv46vi
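
As a minimal illustration of the scaling bottleneck described in the abstract (not the paper's algorithm): a naive kernel method materializes the full m x m kernel matrix, so memory grows quadratically in the number of training patterns. The RBF kernel and the sizes below are arbitrary choices for the sketch.

```python
import numpy as np

def rbf_kernel_matrix(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq_norms = np.sum(X ** 2, axis=1)
    sq_dists = sq_norms[:, None] + sq_norms[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.clip(sq_dists, 0.0, None))

m, d = 2000, 10                    # m training patterns in d dimensions (illustrative values)
X = np.random.randn(m, d)
K = rbf_kernel_matrix(X)           # m x m matrix: O(m^2) storage before the QP is even solved
print(K.shape, K.nbytes / 1e6, "MB")
```

Doubling m quadruples the memory for K, and standard QP solvers over this matrix scale roughly cubically in m, which is the cost the core vector machine line of work aims to avoid.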