
Improved boosting algorithms using confidence-rated predictions

Robert E. Schapire, Yoram Singer
1998 Proceedings of the eleventh annual conference on Computational learning theory - COLT' 98  
We describe several improvements to Freund and Schapire's AdaBoost boosting algorithm, particularly in a setting in which hypotheses may assign confidences to each of their predictions.  ...  We give a specific method for assigning confidences to the predictions of decision trees, a method closely related to one used by Quinlan.  ...  Thanks also to Peter Bartlett for showing us the bound on generalization error in Section 5 using pseudodimension, and to Roland Freund and Tommi Jaakkola for useful comments on numerical methods.  ... 
doi:10.1145/279943.279960 dblp:conf/colt/SchapireS98 fatcat:4jthcn64ejek3ayhal5k66nf7q
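
The entry above concerns real-valued (confidence-rated) weak hypotheses combined by the sign of their sum, with example weights updated as D_{t+1}(i) ∝ D_t(i) exp(-y_i h_t(x_i)). A minimal Python sketch of that scheme follows; the confidence-rated decision stumps, smoothing constant, and toy data are illustrative choices, not the paper's experimental setup.

```python
# Minimal sketch of AdaBoost with confidence-rated (real-valued) weak hypotheses,
# in the spirit of Schapire & Singer (1998). Weak learners are axis-aligned stumps
# whose two leaves output 0.5*ln(W+/W-) computed from the current example weights
# (with smoothing), and the combined classifier is sign(sum_t h_t(x)).
import numpy as np

def fit_confidence_stump(X, y, w, eps=1e-6):
    """Pick the feature/threshold whose confidence-rated stump minimizes Z."""
    best = None
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            side = X[:, j] <= thr                    # boolean split into two leaves
            leaf_out, z = np.zeros(2), 0.0
            for leaf, mask in enumerate((side, ~side)):
                wp = w[mask & (y == 1)].sum()
                wm = w[mask & (y == -1)].sum()
                leaf_out[leaf] = 0.5 * np.log((wp + eps) / (wm + eps))
                z += 2.0 * np.sqrt(wp * wm)          # Z bound from Schapire & Singer
            if best is None or z < best[0]:
                best = (z, j, thr, leaf_out.copy())
    return best[1:]                                  # (feature, threshold, leaf outputs)

def boost(X, y, rounds=20):
    w = np.full(len(y), 1.0 / len(y))
    stumps = []
    for _ in range(rounds):
        j, thr, out = fit_confidence_stump(X, y, w)
        h = np.where(X[:, j] <= thr, out[0], out[1]) # real-valued predictions
        w *= np.exp(-y * h)                          # D_{t+1}(i) ∝ D_t(i) e^{-y_i h_t(x_i)}
        w /= w.sum()
        stumps.append((j, thr, out))
    return stumps

def predict(stumps, X):
    F = sum(np.where(X[:, j] <= thr, out[0], out[1]) for j, thr, out in stumps)
    return np.sign(F)

# Toy usage with a synthetic two-class problem (labels in {-1, +1}).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1, -1)
model = boost(X, y)
print("training accuracy:", (predict(model, X) == y).mean())
```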

Confidence-based multiclass AdaBoost for physical activity monitoring

Attila Reiss, Didier Stricker, Gustaf Hendeby
2013 Proceedings of the 17th annual international symposium on International symposium on wearable computers - ISWC '13  
Therefore, this paper proposes the ConfAdaBoost.M1 algorithm. The proposed algorithm is a variant of the AdaBoost.M1 that incorporates well established ideas for confidence based boosting.  ...  The method is compared to the most commonly used boosting methods using benchmark datasets from the UCI machine learning repository and it is also evaluated on an activity recognition and an intensity  ...  Furthermore, the new algorithm uses the information about how confident the weak learners are to predict the class of the instances.  ... 
doi:10.1145/2493988.2494325 dblp:conf/iswc/ReissSH13 fatcat:6gt26t3cebeolnmv57qsi2fypu
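
The snippet above describes using the weak learners' confidence both during training and at prediction time. The fragment below is a loose sketch of the prediction side only, under the assumption that the weak learners expose per-class probabilities (scikit-learn-style predict_proba); it is not the exact ConfAdaBoost.M1 formulation.

```python
# Loose sketch of the idea described above (not the exact ConfAdaBoost.M1 update):
# each weak learner contributes to the final vote in proportion to both its
# training weight (alpha) and the confidence it reports for the new instance.
import numpy as np

def confidence_weighted_vote(weak_learners, alphas, x, n_classes):
    """Combine weak learners, scaling each vote by its per-instance confidence."""
    scores = np.zeros(n_classes)
    for learner, alpha in zip(weak_learners, alphas):
        proba = learner.predict_proba(x.reshape(1, -1))[0]  # per-class confidence
        k = int(np.argmax(proba))
        scores[k] += alpha * proba[k]                       # confident learners count more
    return int(np.argmax(scores))
```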

A novel confidence-based multiclass boosting algorithm for mobile physical activity monitoring

Attila Reiss, Gustaf Hendeby, Didier Stricker
2014 Personal and Ubiquitous Computing  
Moreover, the confidence values are also used in the prediction part of the algorithm: the more confident the weak learner is in a new instance's prediction the more it counts in the output of the combined  ...  Furthermore, the new algorithm uses the information about how confident the weak learners are to predict the class of the instances.  ... 
doi:10.1007/s00779-014-0816-x fatcat:46wkudfhenchzpouimcfncwn3i

Boosting Trees for Anti-Spam Email Filtering [article]

Xavier Carreras, Lluis Marquez
2001 arXiv   pre-print
Several variants of the AdaBoost algorithm with confidence-rated predictions [Schapire & Singer, 99] have been applied, which differ in the complexity of the base learners considered.  ...  Two main conclusions can be drawn from our experiments: a) The boosting-based methods clearly outperform the baseline learning algorithms (Naive Bayes and Induction of Decision Trees) on the PU1 corpus  ...  In this paper, we show that the AdaBoost algorithm with confidence-rated predictions is a very well suited algorithm for addressing the spam filtering problem.  ... 
arXiv:cs/0109015v1 fatcat:jc3xww54irbptanskfjcn4t574
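
As a rough illustration of the setup described above (AdaBoost with base learners of varying complexity on a bag-of-words spam corpus), the following sketch uses scikit-learn; the toy messages stand in for the PU1 corpus, and scikit-learn >= 1.2 is assumed for the estimator keyword.

```python
# Hedged sketch: AdaBoost over decision trees of increasing depth on a
# bag-of-words representation of (placeholder) spam/legitimate messages.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.feature_extraction.text import CountVectorizer

messages = ["cheap meds buy now", "meeting at noon tomorrow",
            "win a free prize now", "project report attached"]
labels = [1, 0, 1, 0]                     # 1 = spam, 0 = legitimate

X = CountVectorizer().fit_transform(messages)
for depth in (1, 2, 3):                   # base learners of growing complexity
    clf = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=depth),
                             n_estimators=50)
    clf.fit(X, labels)
    print(depth, clf.score(X, labels))
```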

Improved prediction rule ensembling through model-based data generation [article]

Benny Markovitch, Marjolein Fokkema
2021 arXiv   pre-print
The results indicate that the use of surrogacy models can substantially improve the sparsity of PRE, while retaining predictive accuracy, especially through the use of a nested surrogacy approach.  ...  This article examines the use of surrogate models to improve performance of PRE, wherein the Lasso regression is trained with the help of a massive dataset generated by the (boosted) decision tree ensemble  ...  However, attempts to use class predictions instead of logit values did not lead to improvements in accuracy.  ... 
arXiv:2109.13672v1 fatcat:4buwsaj2a5azfcb7dsup7fwdyq
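
A simplified sketch of the surrogacy idea follows: a boosted tree ensemble labels a large, model-generated dataset and a sparse Lasso surrogate is then fit on those continuous predictions. It omits the rule-extraction step of PRE proper; the data, permutation-based generator, and sizes are illustrative assumptions.

```python
# Simplified surrogacy sketch: the ensemble's predictions become the targets
# for a much larger generated dataset, on which a sparse Lasso model is fit.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))
y = X[:, 0] - 2 * X[:, 1] + rng.normal(scale=0.1, size=500)

ensemble = GradientBoostingRegressor().fit(X, y)

# Generate a massive surrogate training set by permuting each feature column
# independently (breaking joint structure but covering the marginal ranges).
X_big = np.vstack([
    np.column_stack([rng.permutation(X[:, j]) for j in range(X.shape[1])])
    for _ in range(20)
])
y_big = ensemble.predict(X_big)            # ensemble predictions as targets

surrogate = Lasso(alpha=0.01).fit(X_big, y_big)
print("nonzero coefficients:", np.sum(surrogate.coef_ != 0))
```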

Classification Algorithm Accuracy Improvement for Student Graduation Prediction Using Ensemble Model

Ace C. Lagman (FEU Institute of Technology, P. Paredes St., Sampaloc, Manila, Philippines), Lourwel P. Alfonso, Marie Luvett I. Goh, Jay-ar P. Lalata, Juan Paulo H. Magcuyao, Heintjie N. Vicente
2020 International Journal of Information and Education Technology  
The accuracy rate of student graduation prediction using the model combination was boosted to 87.60%.  ...  To improve the test-set accuracy of logistic regression, combinations of predictions from a set of classifiers were tested. The experiments were carried out in WEKA using majority voting.  ... 
doi:10.18178/ijiet.2020.10.10.1449 fatcat:dcptmdrpwncevj47kbf5zhxjbu
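
A minimal sketch of the majority-voting combination described above; the paper runs its experiments in WEKA, whereas scikit-learn's VotingClassifier is used here, and the synthetic data stand in for the student records.

```python
# Majority (hard) voting over several classifiers on placeholder data.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=400, n_features=12, random_state=0)
vote = VotingClassifier([("lr", LogisticRegression(max_iter=1000)),
                         ("dt", DecisionTreeClassifier()),
                         ("rf", RandomForestClassifier())], voting="hard")
print("cv accuracy:", cross_val_score(vote, X, y, cv=5).mean())
```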

Application specific loss minimization using gradient boosting

Bin Zhang, Abhinav Sethy, Tara N. Sainath, Bhuvana Ramabhadran
2011 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)  
We also show that this novel framework is useful in identifying regions of high word error rate (WER) and can provide up to 20% relative improvement depending on the chosen operating point.  ...  We also extend the original gradient boosting algorithm with the Newton-Raphson method to speed up learning.  ...  Predicting WER of ASR hypotheses is useful for controlling the error rate.  ... 
doi:10.1109/icassp.2011.5947449 dblp:conf/icassp/ZhangSSR11 fatcat:thm5t4aodff27p542yijlth4mu
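
The following is a hedged sketch of a Newton-style boosting step for logistic loss, in the spirit of extending gradient boosting with the Newton-Raphson method: each tree is fit to the negative gradient and its leaf values are rescaled by the per-leaf ratio of first- to second-order statistics. It is not the paper's application-specific loss, just the generic mechanism on toy data.

```python
# Newton-Raphson leaf values in a gradient-boosting loop for logistic loss.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 5))
y = (X[:, 0] + X[:, 1] > 0).astype(float)          # labels in {0, 1}

F = np.zeros(len(y))                               # boosted score (log-odds)
for _ in range(30):
    p = 1.0 / (1.0 + np.exp(-F))                   # current probabilities
    g, h = p - y, p * (1 - p)                      # gradient and Hessian of log loss
    tree = DecisionTreeRegressor(max_depth=3).fit(X, -g)
    leaf = tree.apply(X)                           # leaf index of each sample
    step = np.zeros(len(y))
    for l in np.unique(leaf):
        m = leaf == l
        step[m] = -g[m].sum() / (h[m].sum() + 1e-12)   # Newton-Raphson leaf value
    F += 0.1 * step                                # shrinkage
print("training accuracy:", ((F > 0) == (y == 1)).mean())
```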

Boosting in Location Space [article]

Damian Eads, David Helmbold, Ed Rosten
2013 arXiv   pre-print
Location-based boosting differs from previous boosting algorithms because it optimizes a new spatial loss function to combine object detectors, each of which may have marginal performance, into a single  ...  Here we introduce a new concept: location-based boosting.  ...  Practical application of location-based boosting to detecting small objects Location-based boosting is an abstract algorithm; it creates an ensemble using a source of confidence-rated detectors.  ... 
arXiv:1309.1080v1 fatcat:sr3swpnchrffxgyyu77o4j6ch4

Collaborative filtering with collective training

Yong Ge, Hui Xiong, Alexander Tuzhilin, Qi Liu
2011 Proceedings of the fifth ACM conference on Recommender systems - RecSys '11  
Essentially, the collective training paradigm builds multiple different Collaborative Filtering (CF) models separately, and augments the training ratings of each CF model by using the partial predictions  ...  One way to address this rating sparsity problem is to develop more effective methods for training rating prediction models.  ...  In this subsection, we introduce the Tri-CF algorithm, which is based on item-oriented KNN (iKNN), uKNN and SVD and boosts one CF model with the augmented ratings generated from the predictions  ... 
doi:10.1145/2043932.2043983 dblp:conf/recsys/GeXTL11 fatcat:naa4lexlovhpjbya7sbbb3slya
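
A schematic sketch of the collective-training idea: several CF models are trained separately and one model's training ratings are augmented with predictions on which the others confidently agree. The three "models" below are deliberately crude stand-ins (user mean, item mean, rank-2 SVD) rather than the iKNN/uKNN/SVD combination of Tri-CF, and the agreement threshold is an assumption.

```python
# One CF model's training matrix is densified with ratings predicted
# (and agreed upon) by two other, independently built predictors.
import numpy as np

R = np.array([[5, 4, 0, 1],      # toy user x item ratings, 0 = unknown
              [4, 0, 0, 1],
              [1, 1, 0, 5],
              [0, 1, 5, 4]], dtype=float)
known = R > 0

masked = np.where(known, R, np.nan)
pred_user = np.repeat(np.nanmean(masked, axis=1, keepdims=True), R.shape[1], axis=1)
pred_item = np.repeat(np.nanmean(masked, axis=0, keepdims=True), R.shape[0], axis=0)

# Augment the third model's training matrix where the two baselines agree closely.
R_aug = R.copy()
agree = (~known) & (np.abs(pred_user - pred_item) < 0.5)
R_aug[agree] = 0.5 * (pred_user + pred_item)[agree]

# Train the third model (rank-2 SVD) on the augmented, denser matrix.
filled = np.where(R_aug > 0, R_aug, R_aug[R_aug > 0].mean())
U, s, Vt = np.linalg.svd(filled, full_matrices=False)
pred_svd = (U[:, :2] * s[:2]) @ Vt[:2]
print(np.round(pred_svd, 2))
```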

Active online confidence boosting for efficient object classification

Dennis Mund, Rudolph Triebel, Daniel Cremers
2015 2015 IEEE International Conference on Robotics and Automation (ICRA)  
Our underlying classifier is from the family of boosting methods, but in contrast to earlier methods, our Confidence Boosting particularly focuses on misclassified samples that have a high classification  ...  We present a novel efficient algorithm for object classification.  ...  As can be seen from Eq. (5), the agreement a_g used by standard gradient boost is only related to the prediction itself, but not to the confidence of the prediction.  ... 
doi:10.1109/icra.2015.7139368 dblp:conf/icra/MundTC15 fatcat:mwasjngtzfcsjjcsexowxn2mkq
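
The weighting below is only a schematic of the emphasis described in the snippet, not the paper's loss: weights grow fastest for samples that are both misclassified and classified with high confidence, in contrast to an update that looks only at the signed prediction.

```python
# Schematic confidence-aware re-weighting: confidently wrong samples dominate.
import numpy as np

def confidence_boost_weights(y_true, y_pred_sign, confidence):
    """y_true, y_pred_sign in {-1, +1}; confidence in [0, 1] per sample."""
    misclassified = (y_true != y_pred_sign).astype(float)
    w = np.exp(misclassified * confidence)     # high-confidence mistakes weigh most
    return w / w.sum()

# Example: the second sample is wrong *and* confident, so it gets the most weight.
print(confidence_boost_weights(np.array([1, 1, -1]),
                               np.array([1, -1, -1]),
                               np.array([0.9, 0.95, 0.2])))
```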

Track Boosting and Synthetic Data Aided Drone Detection [article]

Fatih Cagatay Akyon, Ogulcan Eryuksel, Kamil Anil Ozfuttu, Sinan Onur Altinuc
2021 arXiv   pre-print
Our method approaches the drone detection problem by fine-tuning a YOLOv5 model with real and synthetically generated data using a Kalman-based object tracker to boost detection confidence.  ...  As the usage of drones increases with lowered costs and improved drone technology, drone detection emerges as a vital object detection task.  ...  Object Tracking And Tracker Based Confidence Boosting Object tracking algorithms are used to provide continuity of object detections over time.  ... 
arXiv:2111.12389v2 fatcat:ji7ztulnlzby7ljxgrr3kdftfe
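
A simplified sketch of tracker-based confidence boosting: detections that fall near a predicted track position get their confidence raised. The full Kalman filter is reduced here to constant-velocity extrapolation, and the radius and boost amount are placeholder assumptions.

```python
# Detections consistent with an existing track have their confidence boosted.
import math

class Track:
    def __init__(self, x, y):
        self.x, self.y, self.vx, self.vy = x, y, 0.0, 0.0
    def predict(self):
        return self.x + self.vx, self.y + self.vy   # constant-velocity extrapolation
    def update(self, x, y):
        self.vx, self.vy = x - self.x, y - self.y
        self.x, self.y = x, y

def boost_confidences(tracks, detections, radius=20.0, boost=0.2):
    """detections: list of dicts with 'x', 'y', 'conf'; returns boosted copies."""
    out = []
    for det in detections:
        conf = det["conf"]
        for t in tracks:
            px, py = t.predict()
            if math.hypot(det["x"] - px, det["y"] - py) < radius:
                conf = min(1.0, conf + boost)        # consistent with a track -> boost
                t.update(det["x"], det["y"])
                break
        out.append({**det, "conf": conf})
    return out

tracks = [Track(100, 100)]
dets = [{"x": 105, "y": 102, "conf": 0.35}, {"x": 300, "y": 40, "conf": 0.35}]
print(boost_confidences(tracks, dets))
```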

Pairwise Classification as an Ensemble Technique [chapter]

Johannes Fürnkranz
2002 Lecture Notes in Computer Science  
The performance gain is not as large as for bagging and boosting, but on the other hand round robin ensembles have a clearly defined semantics.  ...  In particular, we show that the use of round robin ensembles will also increase the classification performance of decision tree learners, even though they can directly handle multi-class problems.  ...  confidence in its prediction.  ... 
doi:10.1007/3-540-36755-1_9 fatcat:ymtndsjwxfem3ecokkrvjcduru
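
A minimal sketch of round-robin (pairwise) classification as an ensemble: one decision tree is trained per pair of classes and the final label is chosen by voting, even though a single tree could handle the multi-class problem directly. The iris data are a stand-in for the paper's benchmarks.

```python
# Round-robin ensemble: one binary decision tree per class pair, combined by voting.
from itertools import combinations
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
classes = np.unique(y)

pairwise = {}
for a, b in combinations(classes, 2):          # one binary problem per class pair
    mask = np.isin(y, (a, b))
    pairwise[(a, b)] = DecisionTreeClassifier().fit(X[mask], y[mask])

def predict(x):
    votes = np.zeros(len(classes))
    for clf in pairwise.values():
        votes[clf.predict(x.reshape(1, -1))[0]] += 1
    return int(np.argmax(votes))

preds = np.array([predict(x) for x in X])
print("training accuracy:", (preds == y).mean())
```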

Genetic boosting classification for malware detection

Alejandro Martin, Hector D. Menendez, David Camacho
2016 2016 IEEE Congress on Evolutionary Computation (CEC)  
In the last few years, virus writers have made use of new obfuscation techniques with the ultimate aim of hindering malware from being detected and making detection more complicated.  ...  We have used this benchmark dataset as a starting point to improve these results using Genetic Boosting.  ...  While existing evidence shows the difficulty of achieving high accuracy rates in malware classification, the use of more complex classification models, such as a boosting algorithm, is justified to overcome  ... 
doi:10.1109/cec.2016.7743902 dblp:conf/cec/MartinMC16 fatcat:ndb44at4afbo7jpcswy55rwk7y

Robust Object Tracking based on Detection with Soft Decision

Bo Wu, Li Zhang, Vivek Kumar Singh, Ram Nevatia
2008 2008 IEEE Workshop on Motion and video Computing  
Object trajectories are initialized from the responses of higher confidence; hypothesized objects are tracked by associating with all the responses in the order of their confidence levels.  ...  Responses of different confidence levels are generated by classifiers with different complexities.  ...  To improve the robustness of object tracking, we propose a method that makes "soft decisions" at the detection stage by producing detection responses of different confidence levels and uses these confidence  ... 
doi:10.1109/wmvc.2008.4544052 fatcat:4wuj63qjjfb67a7e52ehainwpm
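
A schematic of the soft-decision strategy described above: responses arrive at several confidence levels, trajectories are initialized only from the highest-confidence responses, and lower-confidence responses may only be associated with existing trajectories. The per-level detectors are omitted and the distances/thresholds are assumptions.

```python
# High-confidence responses start trajectories; lower-confidence ones
# are only allowed to extend trajectories that already exist.
import math

def link_responses(responses, assoc_dist=30.0):
    """responses: list of (confidence_level, x, y); higher level = more confident."""
    trajectories = []                                  # each is a list of (x, y)
    for level, x, y in sorted(responses, key=lambda r: -r[0]):
        nearest = min(trajectories, default=None,
                      key=lambda t: math.hypot(t[-1][0] - x, t[-1][1] - y))
        if nearest and math.hypot(nearest[-1][0] - x, nearest[-1][1] - y) < assoc_dist:
            nearest.append((x, y))                     # associate with existing track
        elif level >= 2:                               # only high confidence starts a track
            trajectories.append([(x, y)])
    return trajectories

print(link_responses([(2, 10, 10), (1, 15, 12), (1, 200, 50)]))
```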

Generating highly accurate prediction hypotheses through collaborative ensemble learning

Nino Arsov, Martin Pavlovski, Lasko Basnarkov, Ljupco Kocarev
2017 Scientific Reports  
Ensemble techniques [13-15] show improved accuracy of predictive analytics and data mining applications.  ...  Ensemble generation is a natural and convenient way of achieving better generalization performance of learning algorithms by gathering their predictive capabilities.  ...  The algorithms were tested on various datasets, showing improved performance in both reducing the error rates and reducing the computation time.  ... 
doi:10.1038/srep44649 pmid:28304378 pmcid:PMC5356335 fatcat:kymz6d7fofcnnfs3gtjwxjse5u
Showing results 1 — 15 out of 50,849 results