A copy of this work was available on the public web and has been preserved in the Wayback Machine; the capture dates from 2017. The file type is application/pdf.
Random Projections for Linear Support Vector Machines
2014
ACM Transactions on Knowledge Discovery from Data
Let X be a data matrix of rank ρ, whose rows represent n points in d-dimensional space. The linear support vector machine constructs a hyperplane separator that maximizes the 1-norm soft margin. We develop a new oblivious dimension reduction technique that is precomputed and can be applied to any input matrix X. We prove that, with high probability, the margin and minimum enclosing ball in the feature space are preserved to within ε-relative error, ensuring comparable generalization as in the original space.
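The abstract describes a data-oblivious projection: a sketching matrix drawn independently of the data, so it can be generated once and applied to any input matrix X before training the SVM. Below is a minimal illustrative sketch of that idea, assuming a dense Gaussian sketching matrix and scikit-learn's LinearSVC as the 1-norm soft-margin solver; the paper's actual constructions also cover structured and sparse sketches, which are not shown here.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)

# Synthetic n x d data matrix X (rows are points in d-dimensional space).
X, y = make_classification(n_samples=500, n_features=1000,
                           n_informative=50, random_state=0)

# Oblivious projection: R is drawn without looking at X, so the same R
# can be reused for any input matrix with the same dimensionality d.
d, r = X.shape[1], 100
R = rng.standard_normal((d, r)) / np.sqrt(r)

X_proj = X @ R  # reduce from d to r dimensions

# Hinge loss corresponds to the 1-norm (sum-of-slacks) soft margin.
clf = LinearSVC(C=1.0, loss="hinge")
clf.fit(X_proj, y)
print("training accuracy:", clf.score(X_proj, y))
```

Because the projection preserves the margin up to ε-relative error with high probability, the SVM trained on X_proj is expected to generalize comparably to one trained on the original X, at a much lower per-sample dimension.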
doi:10.1145/2641760
fatcat:lk757dnq7vaoleqr46wdgy7rpq