32,591 Hits in 3.2 sec

Using boosting to prune bagging ensembles

Gonzalo Martínez-Muñoz, Alberto Suárez
2007 Pattern Recognition Letters  
In problems where boosting is superior to bagging, these improvements are not sufficient to reach the accuracy of the corresponding boosting ensembles.  ...  However, ensemble pruning preserves the performance of bagging in noisy classification tasks, where boosting often has larger generalization errors.  ...  The classifier that minimizes the weighted training error is then incorporated into the ensemble.  ... 
doi:10.1016/j.patrec.2006.06.018 fatcat:4274htskqrcvlkpglhtq44puwy
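The snippet describes the core loop of boosting-based pruning: repeatedly pick the pool member with the lowest weighted training error, then reweight the training examples. A minimal sketch of that selection loop, assuming an already-trained bagging pool whose predictions are precomputed (the function name and the exact AdaBoost-style reweighting are illustrative, not the paper's code):

```python
import math

def boost_prune(pool_predictions, labels, ensemble_size):
    """Order a bagging pool boosting-style and keep a pruned sub-ensemble.

    pool_predictions: one prediction list per pool member, labels in {-1, +1}
    labels: true labels in {-1, +1}
    ensemble_size: number of members to keep
    """
    n = len(labels)
    weights = [1.0 / n] * n          # uniform example weights to start
    remaining = list(range(len(pool_predictions)))
    selected = []
    for _ in range(ensemble_size):
        if not remaining:
            break
        # pick the pool member with the lowest weighted training error
        def weighted_error(m):
            return sum(w for w, p, y in zip(weights, pool_predictions[m], labels)
                       if p != y)
        best = min(remaining, key=weighted_error)
        err = weighted_error(best)
        if err >= 0.5:               # no remaining member beats chance
            break
        selected.append(best)
        remaining.remove(best)
        # AdaBoost-style reweighting: emphasise examples the pick got wrong
        alpha = 0.5 * math.log((1 - err) / max(err, 1e-12))
        weights = [w * math.exp(-alpha * y * p)
                   for w, p, y in zip(weights, pool_predictions[best], labels)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return selected
```

Because the pool is fixed, only the selection order changes; the pruned ensemble keeps bagging's members but orders and truncates them with boosting's weighting rule.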

Instance Significance Guided Multiple Instance Boosting for Robust Visual Tracking [article]

Jinwu Liu and Yao Lu and Tianfei Zhou
2020 arXiv   pre-print
In this paper, we extend this idea towards incorporating the instance significance estimation into the online MILBoost framework.  ...  Next, we follow the online boosting framework, and propose a new criterion for the selection of weak classifiers.  ...  The central idea behind our approach is learning the significance of instances, which we call significance-coefficients, and incorporating them into the bag likelihood to guide the selection of weak classifiers  ... 
arXiv:1501.04378v5 fatcat:dvqqzfmw3ba4xg7b4l5w6ghzpq

Bagged Boosted Trees for Classification of Ecological Momentary Assessment Data [article]

Gerasimos Spanakis and Gerhard Weiss and Anne Roefs
2016 arXiv   pre-print
We propose a new algorithm called BBT (standing for Bagged Boosted Trees) that is enhanced by an over/under sampling method and can provide better estimates for the conditional class probability function  ...  Ecological Momentary Assessment (EMA) data is organized in multiple levels (per-subject, per-day, etc.) and this particular structure should be taken into account in machine learning algorithms used in  ...  In this way, we manage to incorporate the advantages of subject-based bootstrapping and observation-based bootstrapping into the final BBT ensemble.  ... 
arXiv:1607.01582v1 fatcat:v5iftcuhbrbibbvslbuevfbkse
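The snippet mentions combining subject-based and observation-based bootstrapping for multilevel EMA data. One way such a two-level bootstrap could look is sketched below (the function name and sampling details are assumptions for illustration, not the BBT algorithm itself):

```python
import random

def two_level_bootstrap(data_by_subject, rng=random.Random(0)):
    """Draw one bootstrap replicate from multilevel (per-subject) data.

    data_by_subject: dict mapping subject id -> list of observations.
    Subjects are resampled with replacement (subject-based bootstrap),
    then observations are resampled within each drawn subject
    (observation-based bootstrap).
    """
    subjects = list(data_by_subject)
    drawn = rng.choices(subjects, k=len(subjects))      # level 1: subjects
    sample = []
    for s in drawn:
        obs = data_by_subject[s]
        sample.extend(rng.choices(obs, k=len(obs)))     # level 2: observations
    return sample
```

Each base learner of the ensemble would then be trained on one such replicate, so both levels of the data's structure contribute variability.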

Confidence-Rated Multiple Instance Boosting for Object Detection

Karim Ali, Kate Saenko
2014 2014 IEEE Conference on Computer Vision and Pattern Recognition  
Our approach consists in first obtaining confidence estimates over the label space and, second, incorporating these estimates within a new Boosting procedure.  ...  In the second step, the obtained confidence estimates are incorporated into a generalized MILBoost procedure.  ...  Next, we incorporate the estimates within a new Boosting procedure, built by generalizing the MILBoost loss to incorporate a prior over the latent space and applying Friedman's gradient Boosting [11]  ... 
doi:10.1109/cvpr.2014.312 dblp:conf/cvpr/AliS14 fatcat:g5siops355gttca73ltrxoy4yi

Gradient Boosted Feature Selection [article]

Zhixiang Eddie Xu, Gao Huang, Kilian Q. Weinberger, Alice X. Zheng
2019 arXiv   pre-print
algorithm should ideally satisfy four conditions: reliably extract relevant features; be able to identify non-linear feature interactions; scale linearly with the number of features and dimensions; allow the incorporation  ...  The algorithm is flexible, scalable, and surprisingly straight-forward to implement as it is based on a modification of Gradient Boosted Trees.  ...  Finally, while not a feature selection method, [34] employ Gradient Boosted Trees to learn cascades of classifiers to reduce test-time cost by incorporating feature extraction budgets into the classifier  ... 
arXiv:1901.04055v1 fatcat:eq6mw6f4dndrjeh4s4x5k6vwmu

Gradient boosted feature selection

Zhixiang Xu, Gao Huang, Kilian Q. Weinberger, Alice X. Zheng
2014 Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining - KDD '14  
algorithm should ideally satisfy four conditions: reliably extract relevant features; be able to identify non-linear feature interactions; scale linearly with the number of features and dimensions; allow the incorporation  ...  The algorithm is flexible, scalable, and surprisingly straight-forward to implement as it is based on a modification of Gradient Boosted Trees.  ...  Finally, while not a feature selection method, [31] employ Gradient Boosted Trees to learn cascades of classifiers to reduce test-time cost by incorporating feature extraction budgets into the classifier  ... 
doi:10.1145/2623330.2623635 dblp:conf/kdd/XuHWZ14 fatcat:digaumqncbdf3m2gmadjznmf3q
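The abstract's key idea, selecting the features that gradient-boosted trees actually split on, can be illustrated with a toy boosting pass over depth-1 trees on squared loss. This is a simplified stand-in for the paper's method (names and the exhaustive stump learner are illustrative):

```python
def gbfs_sketch(X, y, n_rounds=10, lr=0.5):
    """Tiny gradient-boosting pass with stumps on squared loss; the
    features chosen for splits form the selected feature set."""
    n, d = len(X), len(X[0])
    pred = [0.0] * n
    used = set()
    for _ in range(n_rounds):
        residual = [yi - pi for yi, pi in zip(y, pred)]
        best = None  # (sse, feature, threshold, left_value, right_value)
        for j in range(d):
            for t in sorted(set(row[j] for row in X)):
                left = [r for row, r in zip(X, residual) if row[j] <= t]
                right = [r for row, r in zip(X, residual) if row[j] > t]
                if not left or not right:
                    continue
                lv, rv = sum(left) / len(left), sum(right) / len(right)
                sse = (sum((r - lv) ** 2 for r in left)
                       + sum((r - rv) ** 2 for r in right))
                if best is None or sse < best[0]:
                    best = (sse, j, t, lv, rv)
        if best is None:
            break
        _, j, t, lv, rv = best
        used.add(j)                  # feature j was needed by the ensemble
        pred = [p + lr * (lv if row[j] <= t else rv)
                for p, row in zip(pred, X)]
    return used
```

Features that never win a split never enter `used`, which is the sense in which boosting doubles as a feature selector.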

Introducing A Hybrid Data Mining Model to Evaluate Customer Loyalty

H. Alizadeh, B. Minaei Bidgoli
2016 Engineering, Technology & Applied Science Research  
and boosting to predict the class of loyal customers.  ...  The results showed that the bagging-ANN was the most accurate method in predicting loyal customers.  ...  Then the customer data is integrated; that is, the primary dataset containing transactional customer variables is consolidated into a single table, so that every customer has one data record.  ... 
doi:10.48084/etasr.741 fatcat:7rfhxjl3gvedhmcpypiehs2uuy

Software Defect Prediction Using Ensemble Learning: An ANP Based Evaluation Method

Abdullateef O Balogun, Amos O Bajeh, Victor A Orie, Ayisat W Yusuf-Asaju
2018 FUOYE Journal of Engineering and Technology  
This paper evaluated the performance of single classifiers (SMO, MLP, kNN and Decision Tree) and ensembles (Bagging, Boosting, Stacking and Voting) in SDP considering major performance metrics using the Analytic Network Process (ANP).  ...  These results clearly show that ensemble methods can give better classification results in SDP, and the Boosting method gave the best result.  ...  11. Boosted J48 0.0289; 12. Bagged KNN 0.0230; 13. KNN 0.0164  ... 
doi:10.46792/fuoyejet.v3i2.200 fatcat:3b2eaocmkfa3nnuqkek7sxptce

Use of Artificial Intelligence for Predicting Parameters of Sustainable Concrete and Raw Ingredient Effects and Interactions

Muhammad Nasir Amin, Waqas Ahmad, Kaffayatullah Khan, Ayaz Ahmad, Sohaib Nazar, Anas Abdulalim Alabdullah
2022 Materials  
Incorporating waste material, such as recycled coarse aggregate concrete (RCAC), into construction material can reduce environmental pollution.  ...  The R2 value of 0.98 from DT-Gradient Boosting supersedes those of the other methods, i.e., DT-XG-Boost, SVM-Bagging, and SVM-AdaBoost.  ...  Figure 19. Sub-models. (a) SVM-AdaBoost; (b) SVM-Bagging; (c) DT-Gradient Boosting; (d) DT-XG-Boost.  ... 
doi:10.3390/ma15155207 pmid:35955144 pmcid:PMC9369900 fatcat:55n4pe7jvbcbtggnhxzd7uuqkq

Boosting SVM classifiers by ensemble

Yan-Shi Dong, Ke-Song Han
2005 Special interest tracks and posters of the 14th international conference on World Wide Web - WWW '05  
We try to attack this problem by ensemble methods, which are often used for boosting weak classifiers, such as decision trees, neural networks, etc., and whether they are effective for strong classifiers  ...  Besides, we incorporate biased sampling into these partitioning methods, i.  ...  Part, fold, clust, bag and boost represent disjunct, fold and clustering data partitioning, as well as bagging and boosting, respectively.  ... 
doi:10.1145/1062745.1062874 dblp:conf/www/DongH05 fatcat:5xuc5s2qwrd3vdbtlqrdng3l4a

SGB-ELM: An Advanced Stochastic Gradient Boosting-Based Ensemble Scheme for Extreme Learning Machine

Hua Guo, Jikui Wang, Wei Ao, Yulin He
2018 Computational Intelligence and Neuroscience  
Instead of incorporating the stochastic gradient boosting method into the ELM ensemble procedure naively, SGB-ELM constructs a sequence of weak ELMs where each individual ELM is trained additively by optimizing  ...  A novel ensemble scheme for extreme learning machine (ELM), named Stochastic Gradient Boosting-based Extreme Learning Machine (SGB-ELM), is proposed in this paper.  ...  In view of this, a minor modification named stochastic gradient boosting is proposed to incorporate some randomization into the procedure.  ... 
doi:10.1155/2018/4058403 pmid:30046300 pmcid:PMC6038681 fatcat:wpjivicqkfd4hoi4ngjigfdlme
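The "minor modification named stochastic gradient boosting" in the snippet is Friedman's idea of fitting each weak learner on a random subsample of the residuals. A generic squared-loss skeleton is sketched below; the `fit_weak` interface is an assumption for illustration (SGB-ELM itself plugs ELMs in as the weak learners):

```python
import random

def stochastic_boost(X, y, fit_weak, n_rounds=20, subsample=0.5, lr=0.1, seed=0):
    """Stochastic gradient boosting skeleton (squared loss): each round the
    weak learner sees only a random fraction of the residuals, injecting the
    randomization that plain gradient boosting lacks.

    fit_weak(X_sub, residual_sub) must return a callable model: row -> float.
    """
    rng = random.Random(seed)
    n = len(X)
    pred = [0.0] * n
    models = []
    for _ in range(n_rounds):
        idx = rng.sample(range(n), max(1, int(subsample * n)))  # stochastic part
        residual = [y[i] - pred[i] for i in idx]
        model = fit_weak([X[i] for i in idx], residual)
        models.append(model)
        # shrunken additive update over the full training set
        pred = [p + lr * model(row) for p, row in zip(pred, X)]
    return models, pred
```

With `subsample=1.0` this reduces to ordinary gradient boosting; smaller fractions trade a little bias for lower variance and cheaper rounds.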

Performance Analysis of Classification Algorithms on Birth Dataset

Syed Ali Abbas, Aqeel ur Rehman, Fiaz Majeed, Abdul Majid, M. Sheraz Arshed Malik, Zaki Hassan Kazmi, Seemab Zafar
2020 IEEE Access  
INDEX TERMS Cesarean-section, machine learning, bagging, classification, boosting, health care.  ...  In the current study, we utilize the learning capability of machine learning methods for the classification of birth data using bagging and boosting classification algorithms.  ...  The methods incorporated include different variants of bagging and boosting.  ... 
doi:10.1109/access.2020.2999899 fatcat:nbddtjw7rfe7loaieui3i3hv5m

Exploring Ensemble-Based Class Imbalance Learners for Intrusion Detection in Industrial Control Networks

Maya Hilda Lestari Louk, Bayu Adhi Tama
2021 Big Data and Cognitive Computing  
oversampling bagging, random undersampling bagging, synthetic minority oversampling bagging, random undersampling boosting, synthetic minority oversampling boosting, AdaC2, and EasyEnsemble.  ...  Furthermore, undersampling and oversampling strategies were effective in a boosting-based ensemble but not in a bagging-based ensemble.  ...  It incorporates SMOTE into the AdaBoost algorithm. SMOTE creates new minority samples by interpolating between existing minority samples and their nearest neighbors.  ... 
doi:10.3390/bdcc5040072 fatcat:wx6wqsrl4fd6jki773f76tehja
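The SMOTE step the snippet refers to is interpolation between a minority point and one of its k nearest minority neighbors. A minimal sketch of just that generation step (names and defaults are illustrative):

```python
import random

def smote_sketch(minority, k=2, n_new=4, rng=random.Random(0)):
    """Generate synthetic minority points by interpolating between a sample
    and one of its k nearest minority neighbors (the core SMOTE step)."""
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbors = sorted((p for p in minority if p is not x),
                           key=lambda p: dist2(x, p))[:k]
        nb = rng.choice(neighbors)
        gap = rng.random()                      # position along the segment
        synthetic.append(tuple(xi + gap * (ni - xi) for xi, ni in zip(x, nb)))
    return synthetic
```

Boosting variants like SMOTEBoost would call a step like this before each boosting round, so each weak learner trains on a differently rebalanced set.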

Regression trees for predicting mortality in patients with cardiovascular disease: What improvement is achieved by using ensemble-based methods?

Peter C. Austin, Douglas S. Lee, Ewout W. Steyerberg, Jack V. Tu
2012 Biometrical Journal  
We aimed to evaluate the improvement that is achieved by using ensemble-based methods, including bootstrap aggregation (bagging) of regression trees, random forests, and boosted regression trees.  ...  However, conventional logistic regression models that incorporated restricted cubic smoothing splines had even better performance.  ...  Logistic regression resulted in predicted probabilities of 30-day death that ranged from 0.001 to 0.964 (0.001-0.961 when smoothing splines were incorporated into the model).  ... 
doi:10.1002/bimj.201100251 pmid:22777999 pmcid:PMC3470596 fatcat:qs3cwwslenae3haqqkijnps55i

A model for improved performance prediction using ensemble-based hybrid classification approach on a multivariate student dataset

M Anoopkumar, A. M. J. Md. Zubair Rahman
2018 International Journal of Engineering & Technology  
Here, the basic ensemble methods such as Bagging, Classification Boosting and Stacking are used for optimising the results with more precision.  ...  The mining process with new attributes based on student behaviours has also been incorporated, since it creates a great impact on their academic performance.  ...  The overall working of the bagging technique is illustrated. Boosting is a slightly different technique from bagging.  ... 
doi:10.14419/ijet.v7i4.23542 fatcat:tw53vj2ptrh33fxnt3m7g4tzhi
Showing results 1 — 15 out of 32,591 results