A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2016; you can also visit the original URL.
The file type is application/pdf.
How boosting the margin can also boost classifier complexity
2006
Proceedings of the 23rd international conference on Machine learning - ICML '06
The Learning Task: given m training examples and their labels, predict the label of a new example.
The Idea of Boosting: combine many "moderately inaccurate" base classifiers into a single combined predictor. A new base classifier is generated in each round, with attention constantly refocused on the hardest examples. The final predictor is the weighted vote of the base classifiers; AdaBoost sets the voting weight of each new base classifier so as to reduce an upper bound on the training error.
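As a rough illustration of the round-by-round reweighting and weighted vote described above, the following minimal Python sketch implements AdaBoost with decision stumps. The stump learner, function names, and parameters are illustrative assumptions, not code from the paper or the talk.

```python
import numpy as np

def train_stump(X, y, w):
    """Fit the best single-feature threshold classifier under example weights w."""
    best = None
    for j in range(X.shape[1]):
        for thresh in np.unique(X[:, j]):
            for sign in (1, -1):
                pred = np.where(X[:, j] <= thresh, sign, -sign)
                err = np.sum(w * (pred != y))
                if best is None or err < best[0]:
                    best = (err, j, thresh, sign)
    return best  # (weighted error, feature index, threshold, sign)

def stump_predict(stump, X):
    _, j, thresh, sign = stump
    return np.where(X[:, j] <= thresh, sign, -sign)

def adaboost(X, y, rounds=50):
    """AdaBoost with decision stumps; labels y must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform example weights
    ensemble = []                        # list of (voting weight, base classifier)
    for _ in range(rounds):
        stump = train_stump(X, y, w)
        err = max(stump[0], 1e-10)       # guard against zero weighted error
        # voting weight chosen to reduce the exponential upper bound on training error
        alpha = 0.5 * np.log((1 - err) / err)
        pred = stump_predict(stump, X)
        # reweight: misclassified ("hardest") examples gain weight for the next round
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, stump))
    return ensemble

def predict(ensemble, X):
    """Final predictor: sign of the weighted vote of the base classifiers."""
    votes = sum(alpha * stump_predict(stump, X) for alpha, stump in ensemble)
    return np.sign(votes)
```

A typical call would be `ensemble = adaboost(X_train, y_train, rounds=100)` followed by `predict(ensemble, X_test)`, with `X_train` a feature matrix and `y_train` a vector of +/-1 labels.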
doi:10.1145/1143844.1143939
dblp:conf/icml/ReyzinS06
fatcat:nxe3xvq3tjcnveeusyhns5deoe