329 Hits in 8.3 sec

An Optimized Stacking Ensemble Model for Phishing Websites Detection

Mohammed Al-Sarem, Faisal Saeed, Zeyad Ghaleb Al-Mekhlafi, Badiea Abdulkarem Mohammed, Tawfik Al-Hadhrami, Mohammad T. Alshammari, Abdulrahman Alreshidi, Talal Sarheed Alshammari
2021 Electronics  
The optimisation was carried out using a genetic algorithm (GA) to tune the parameters of several ensemble machine learning methods, including random forests, AdaBoost, XGBoost, Bagging, GradientBoost,  ...  This paper proposes an optimized stacking ensemble method for phishing website detection.  ...  Acknowledgments: We would like to acknowledge the Scientific Research Deanship at the University of Ha'il, Saudi Arabia, for funding this research.  ... 
doi:10.3390/electronics10111285 fatcat:hw53cn73vbaezfl73ckknusjva
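
The snippet above describes a GA-tuned stacking ensemble over random forests, AdaBoost, bagging, and gradient boosting. Below is a minimal, illustrative sketch of that general setup in scikit-learn, not the authors' implementation: the data is synthetic rather than phishing-website features, and a small randomized hyperparameter search stands in for the paper's genetic algorithm.

```python
# Illustrative sketch only: a stacking ensemble with a randomized search
# standing in for the paper's genetic-algorithm tuning step.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              GradientBoostingClassifier, RandomForestClassifier,
                              StackingClassifier)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV, train_test_split

# Synthetic stand-in for a phishing-URL feature matrix (binary labels).
X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

stack = StackingClassifier(
    estimators=[
        ("rf", RandomForestClassifier(random_state=0)),
        ("ada", AdaBoostClassifier(random_state=0)),
        ("bag", BaggingClassifier(random_state=0)),
        ("gb", GradientBoostingClassifier(random_state=0)),
    ],
    final_estimator=LogisticRegression(max_iter=1000),
    cv=5,
)

# Stand-in for GA tuning: random search over a small hyperparameter space.
param_space = {
    "rf__n_estimators": [100, 200, 400],
    "ada__n_estimators": [50, 100, 200],
    "gb__learning_rate": [0.05, 0.1, 0.2],
}
search = RandomizedSearchCV(stack, param_space, n_iter=5, cv=3, random_state=0)
search.fit(X_tr, y_tr)
print("held-out accuracy:", search.score(X_te, y_te))
```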

Ensemble learning for intrusion detection systems: A systematic mapping study and cross-benchmark evaluation

Bayu Adhi Tama, Sunghoon Lim
2021 Computer Science Review  
Furthermore, this study reports and analyzes an empirical investigation of a new classifier ensemble approach, called stack of ensemble (SoE), for anomaly-based IDS.  ...  To achieve a higher detection rate, the ability to design an improved detection framework is sought after, particularly when utilizing ensemble learners.  ...  [64] used a multi-objective genetic algorithm to determine Pareto-optimal ensembles of base-level classifiers for intrusion detection and validated that Pareto-optimal ensembles outperform the  ... 
doi:10.1016/j.cosrev.2020.100357 fatcat:6vojyshrd5aencao6hfgcpd2mm

Predictive Ensemble Pruning by Expectation Propagation

Huanhuan Chen, P. Tiňo, Xin Yao
2009 IEEE Transactions on Knowledge and Data Engineering  
Therefore, the LOO error is used together with the Bayesian evidence for model selection in this algorithm.  ...  Our results are very competitive compared with other ensemble pruning algorithms.  ...  Algorithms for Bagging, AdaBoost, and Random Forests TABLE 8 The Mean Rank of These Algorithms with Different Ensemble Algorithms and the Unpruned Ensemble TABLE 9 Friedman Tests with the Corresponding  ... 
doi:10.1109/tkde.2009.62 fatcat:x6y27tgf3nhdzm6sdeqfb4enxa
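
For readers unfamiliar with what ensemble pruning optimizes, here is a deliberately simple sketch: a greedy forward selection of members from a bagged pool, scored by validation accuracy of the majority vote. This is only a stand-in for the idea; it is not the paper's expectation-propagation method with LOO error and Bayesian evidence.

```python
# Not the paper's EP-based method: a much simpler greedy forward-selection
# pruning of a bagged tree pool, shown only to illustrate the pruning goal
# (a small sub-ensemble that still votes well).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.utils import resample

X, y = make_classification(n_samples=1500, n_features=20, random_state=1)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=1)

# Train a pool of bootstrap-resampled trees (a plain bagging pool).
pool = []
for seed in range(25):
    Xb, yb = resample(X_tr, y_tr, random_state=seed)
    pool.append(DecisionTreeClassifier(random_state=seed).fit(Xb, yb))

preds = np.array([m.predict(X_val) for m in pool])  # shape (n_models, n_val)

def vote_accuracy(indices):
    """Accuracy of the majority vote of the selected pool members."""
    votes = (preds[indices].mean(axis=0) >= 0.5).astype(int)
    return (votes == y_val).mean()

# Greedy forward selection: keep adding the member that helps the vote most.
selected, remaining, best = [], list(range(len(pool))), 0.0
while remaining:
    acc, i = max((vote_accuracy(selected + [j]), j) for j in remaining)
    if acc < best:
        break
    best = acc
    selected.append(i)
    remaining.remove(i)

print(f"kept {len(selected)}/{len(pool)} members, validation accuracy {best:.3f}")
```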

Detection of Suicidal Ideation on Twitter using Machine Learning & Ensemble Approaches

Syed Tanzeel Rabani, Qamar Rayees Khan, Akib Mohi UD Din Khanday
2020 Baghdad Science Journal  
However, the power of prediction for detecting genuine suicidality is not confirmed yet, and this study does not directly communicate with or intervene for people exhibiting suicidal behaviour.  ...  Finally, various machine learning and ensemble methods are used to automatically distinguish Suicidal and Non-Suicidal tweets.  ...  The researchers used Naïve Bayes (NB) and Support Vector Machine (SVM) for classification. The model was optimized through genetic algorithms, attaining an F-score of 92.69%.  ... 
doi:10.21123/bsj.2020.17.4.1328 fatcat:j3gyrxgnanbpxbo6sajxqfb36u
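
The snippet reports Naïve Bayes and SVM classifiers for distinguishing tweets. The sketch below shows that generic kind of text-classification pipeline (TF-IDF features feeding NB and a linear SVM); the tiny example texts and labels are placeholders, not the study's Twitter data or feature engineering.

```python
# Minimal sketch of the pipeline family the entry describes (TF-IDF features,
# Naive Bayes and a linear SVM); the toy texts/labels are placeholders only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

texts = ["i cannot go on anymore", "great day at the beach",
         "nobody would miss me", "excited about the new job"]
labels = [1, 0, 1, 0]  # 1 = concerning, 0 = not (toy placeholder labels)

for clf in (MultinomialNB(), LinearSVC()):
    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), clf)
    model.fit(texts, labels)
    print(type(clf).__name__, model.predict(["i feel like giving up"]))
```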

Homogenous Multiple Classifier System for Software Quality Assessment Based on Support Vector Machine

Udoinyang G. Inyang, Olufemi S. Adeoye, Edward N. Udo, Edidiong F. Bassey, Enefiok A. Etuk, Fidelia N. Ugwoke, Emmanuel B. Usoro
2022 Computer and Information Science  
The results from the multiple SVMs were combined using AdaBoost, bagging, and random subspace ensemble methods for the assessment of SQ.  ...  All three ensemble learning methods performed better than the individual SVM; however, bagging stood out with an accuracy of 93.0%.  ...  Research, University of Uyo and the University of Uyo management for the enabling environment.  ... 
doi:10.5539/cis.v15n3p47 fatcat:rmslefegdre2rpw6j256fefdey
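
As a rough illustration of the three homogeneous SVM ensembles the entry mentions (AdaBoost, bagging, random subspace), here is a short scikit-learn sketch on synthetic data rather than the software-quality metrics used in the paper; the specific SVM kernel and ensemble sizes are arbitrary choices, not the authors' settings.

```python
# Sketch of three homogeneous SVM ensembles (AdaBoost, bagging, random
# subspace) on synthetic data. The `estimator=` keyword is named
# `base_estimator=` in scikit-learn versions before 1.2.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=25, random_state=2)
svm = SVC(kernel="rbf", probability=True, random_state=2)

ensembles = {
    "AdaBoost of SVMs": AdaBoostClassifier(estimator=svm, n_estimators=10),
    "Bagging of SVMs": BaggingClassifier(estimator=svm, n_estimators=10),
    # Random subspace: each SVM sees a random half of the features and
    # there is no bootstrap resampling of the rows.
    "Random subspace": BaggingClassifier(estimator=svm, n_estimators=10,
                                         max_features=0.5, bootstrap=False),
}
for name, model in ensembles.items():
    print(name, cross_val_score(model, X, y, cv=3).mean())
```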

Detecting epistatic effects in association studies at a genomic level based on an ensemble approach

J. Li, B. Horstman, Y. Chen
2011 Bioinformatics  
We extend the basic AdaBoost algorithm by incorporating an intuitive importance score based on Gini impurity to select candidate SNPs.  ...  We have performed extensive simulation studies using three interaction models to evaluate the efficacy of our approach at realistic GWAS sizes, and have compared it with existing epistatic detection algorithms  ...  ACKNOWLEDGEMENTS We thank the Ohio Supercomputer Center for an allocation of computing time. Conflict of Interest: none declared.  ... 
doi:10.1093/bioinformatics/btr227 pmid:21685074 pmcid:PMC3117367 fatcat:iiw62iqjznembhbmmim3wwcmz4
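
The paper extends AdaBoost with its own Gini-impurity-based importance score for selecting candidate SNPs. The sketch below is only a rough analogue: stock AdaBoost over depth-2 trees (so each weak learner can split on a pair of SNPs) plus scikit-learn's built-in impurity-based feature_importances_, on a synthetic genotype-like matrix with a toy epistatic signal.

```python
# Rough analogue only: stock AdaBoost over shallow trees plus the built-in
# impurity (Gini) based feature_importances_, used to rank candidate "SNPs".
# The paper's own importance score and interaction models are not reproduced.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic genotype-like matrix: 500 samples x 100 SNPs coded 0/1/2.
rng = np.random.default_rng(3)
X = rng.integers(0, 3, size=(500, 100)).astype(float)
# Toy epistatic signal: the phenotype depends on SNP 4 and SNP 7 jointly.
y = ((X[:, 4] * X[:, 7]) >= 2).astype(int)

# Depth-2 trees let each weak learner split on a pair of SNPs.
model = AdaBoostClassifier(
    estimator=DecisionTreeClassifier(max_depth=2), n_estimators=200, random_state=3
)
model.fit(X, y)

# Rank SNPs by aggregated impurity-based importance.
top = np.argsort(model.feature_importances_)[::-1][:5]
print("top-ranked SNP indices:", top)
```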

Weighted Random Forests to Improve Arrhythmia Classification

Krzysztof Gajowniczek, Iga Grzegorczyk, Tomasz Ząbkowski, Chandrajit Bajaj
2020 Electronics  
The main goals of this article are to propose a new weighting algorithm applicable to each tree in the Random Forest model and to comprehensively examine optimal parameter tuning.  ...  Construction of an ensemble model is a process of combining many diverse base predictive learners.  ...  [3] propose a systematic approach to find the optimal weights to create ensembles for the bias-variance tradeoff, using cross-validation for regression problems (Cross-validated Optimal Weighted Ensemble  ... 
doi:10.3390/electronics9010099 pmid:32051761 pmcid:PMC7015067 fatcat:2qfopijjsraxtky7yxyxu27inm
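
The article proposes its own per-tree weighting scheme for Random Forests. The sketch below conveys only the general idea under a simpler stand-in assumption: each tree's vote is scaled by its accuracy on a held-out validation split, and the data is synthetic rather than the arrhythmia recordings used in the paper.

```python
# Stand-in for the paper's per-tree weighting: scale each tree's vote by its
# accuracy on a held-out validation split (illustrative only).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=30, n_classes=3,
                           n_informative=10, random_state=4)
X_tr, X_rest, y_tr, y_rest = train_test_split(X, y, test_size=0.4, random_state=4)
X_val, X_te, y_val, y_te = train_test_split(X_rest, y_rest, test_size=0.5,
                                            random_state=4)

rf = RandomForestClassifier(n_estimators=100, random_state=4).fit(X_tr, y_tr)

# One weight per tree: its accuracy on the validation split.
weights = np.array([tree.score(X_val, y_val) for tree in rf.estimators_])
weights /= weights.sum()

# Weighted soft vote over the trees' class-probability estimates.
proba = sum(w * tree.predict_proba(X_te) for w, tree in zip(weights, rf.estimators_))
weighted_pred = rf.classes_[np.argmax(proba, axis=1)]

print("plain forest accuracy:   ", rf.score(X_te, y_te))
print("weighted forest accuracy:", (weighted_pred == y_te).mean())
```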

Fault Detection and Classification in Transmission Lines Connected to Inverter-Based Generators Using Machine Learning

Khalfan Al Kharusi, Abdelsalam El Haffar, Mostefa Mesbah
2022 Energies  
The forward feature selection combined with the Bag ensemble classifier achieved 100% accuracy, sensitivity, specificity, and precision for fault detection (binary classification), while the AdaBoost ensemble  ...  algorithm.  ...  feature selection for fault detection and AdaBoost ensemble with the complete set of features of current signals for fault classification).  ... 
doi:10.3390/en15155475 fatcat:tehtkyr5nngovbltxzfe5zzdh4
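
The entry pairs forward feature selection with a bagged ensemble. Below is a hedged sketch of that pairing using scikit-learn's SequentialFeatureSelector in front of a BaggingClassifier; the data is synthetic, not the transmission-line current/voltage features of the paper, and the feature counts are arbitrary.

```python
# Sketch of forward feature selection feeding a bagged ensemble, on synthetic
# data. SequentialFeatureSelector needs scikit-learn >= 0.24; `estimator=` is
# named `base_estimator=` in versions before 1.2.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=600, n_features=30, n_informative=6,
                           random_state=5)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=5)

bag = BaggingClassifier(estimator=DecisionTreeClassifier(), n_estimators=25,
                        random_state=5)
pipe = make_pipeline(
    # Greedily add features one at a time, scored by 3-fold CV of the ensemble.
    SequentialFeatureSelector(bag, n_features_to_select=6,
                              direction="forward", cv=3),
    bag,
)
pipe.fit(X_tr, y_tr)
print("held-out accuracy:", pipe.score(X_te, y_te))
```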

Supervised and Unsupervised Machine Learning for Improved Identification of Intrauterine Growth Restriction Types

Agnieszka Wosiak, Agata Zamecznik, Katarzyna Niewiadomska-Jarosik
2016 Proceedings of the 2016 Federated Conference on Computer Science and Information Systems  
, Naïve Bayes, random tree and sequential minimal optimization algorithm for training support vector machines.  ...  Supervised learning techniques included bagging with Naïve Bayes, k-nearest neighbours (kNN), C4.5 and SMO as base classifiers, random forest as a variant of bagging with a decision tree as a base classifier  ...  The basic AdaBoost algorithm deals with binary classification.  ... 
doi:10.15439/2016f515 dblp:conf/fedcsis/WosiakZN16 fatcat:gyhlo7xzxjhn5gacvspfuj5uui

Data science in economics: comprehensive review of advanced machine learning and deep learning methods

Saeed Nosratabadi, Amir Mosavi, Puhong Duan, Pedram Ghamisi, Ferdinand Filip, Shahab S. Band, Uwe Reuter, Joao Gama, Amir H. Gandomi
2020 Zenodo  
The findings reveal that the trends follow the advancement of hybrid models, which, based on the accuracy metric, outperform other learning algorithms.  ...  The analysis was performed on novel data science methods in four individual classes of deep learning models, hybrid deep learning models, hybrid machine learning, and ensemble models.  ...  [93] respectively applied Bagged-pSVM and Boosted-pSVM models, a genetic algorithm with the naïve Bayes and SVM, SMOTE-AdaBoost-REP Tree to predict corporate bankruptcy. Ullah et al.  ... 
doi:10.5281/zenodo.4087812 fatcat:4flgeabkxvgjrpbydfby3v6tua

A Review of Ensemble Methods in Bioinformatics

Pengyi Yang, Yee Hwa Yang, Bing B. Zhou, Albert Y. Zomaya
2010 Current Bioinformatics  
Recent work in computational biology has seen an increasing use of ensemble learning methods due to their unique advantages in dealing with small sample sizes, high dimensionality, and complex data structures  ...  Promising directions such as ensembles of support vector machines, meta-ensembles, and ensemble-based feature selection are discussed.  ...  Acknowledgement We thank Professor Joachim Gudmundsson for critical comments and constructive suggestions which greatly improved an earlier version of this article.  ... 
doi:10.2174/157489310794072508 fatcat:muzcldjxifc23kl4tynz4lwjlu

An empirical comparison of techniques for the class imbalance problem in churn prediction

Bing Zhu, Bart Baesens, Seppe K.L.M. vanden Broucke
2017 Information Sciences  
Our study offers valuable insights for academics and professionals, and it also provides a baseline to develop new methods for dealing with class imbalance in churn prediction.  ...  In this paper, we comprehensively compare the performance of state-of-the-art techniques to deal with class imbalance in the context of churn prediction.  ...  Furthermore, there is still room to develop algorithms that can deal with class imbalance and optimize for a profit-based evaluation.  ... 
doi:10.1016/j.ins.2017.04.015 fatcat:hvrxioukwzh2npqvmpicsqlnju
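
The paper benchmarks many imbalance-handling techniques; as a single representative example, the sketch below applies SMOTE oversampling (from the imbalanced-learn package) before a standard classifier on a synthetic 5%-positive dataset. It is not the paper's benchmark protocol, datasets, or profit-based evaluation.

```python
# One representative class-imbalance technique: SMOTE oversampling of the
# training split (imbalanced-learn), then an ordinary classifier.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic churn-like data with ~5% positives.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05],
                           random_state=6)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=6)

# Oversample only the training split, never the test split.
X_res, y_res = SMOTE(random_state=6).fit_resample(X_tr, y_tr)

clf = RandomForestClassifier(random_state=6).fit(X_res, y_res)
print("AUC with SMOTE:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```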

Data Science in Economics [article]

Saeed Nosratabadi, Amir Mosavi, Puhong Duan, Pedram Ghamisi
2020 arXiv   pre-print
The data science advances are investigated in three individual classes of deep learning models, ensemble models, and hybrid models.  ...  On the other hand, it is found that based on the RMSE accuracy metric, hybrid models had higher prediction accuracy than other algorithms.  ...  [69] respectively applied Bagged-pSVM and Boosted-pSVM models, genetic algorithm with the naïve Bayes and SVM, SMOTE-AdaBoost-REP Tree to predict corporate bankruptcy. Ullah et al.  ... 
arXiv:2003.13422v1 fatcat:genllmgl3bhmrhq4txfvlgbpey

Hybrid Metaheuristics to the Automatic Selection of Features and Members of Classifier Ensembles

Antonino Feitosa Neto, Anne Canuto, João Xavier-Junior
2018 Information  
Metaheuristic algorithms have been applied to a wide range of global optimization problems.  ...  Our findings demonstrate a competitive performance of both techniques, in which a hybrid technique provided the lowest error rate for most of the analyzed objective functions.  ...  However, we applied a genetic algorithm instead of a memetic algorithm, because the memetic algorithm is itself a hybrid algorithm and genetic algorithms have been widely used in the optimization of ensemble systems  ... 
doi:10.3390/info9110268 fatcat:7fokby2g7jcq7ddsfoj2gpeg7a
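
To illustrate the member-selection idea behind this entry, here is a bare-bones genetic algorithm that evolves a binary "which pool members to keep" chromosome, with fitness defined as validation accuracy of the majority vote. The pool construction, GA operators, and fitness are simple assumptions for the sketch; the paper's hybrid metaheuristics and objective functions are not reproduced.

```python
# Bare-bones GA selecting members of a classifier pool (fitness = validation
# accuracy of the majority vote). Illustrative assumptions throughout.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
X, y = make_classification(n_samples=1200, n_features=20, random_state=7)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=7)

# Pool of diverse members: trees trained on random feature subsets.
n_pool = 20
pool_feats = [rng.choice(20, size=8, replace=False) for _ in range(n_pool)]
pool = [DecisionTreeClassifier(random_state=i).fit(X_tr[:, f], y_tr)
        for i, f in enumerate(pool_feats)]
val_preds = np.array([m.predict(X_val[:, f]) for m, f in zip(pool, pool_feats)])

def fitness(mask):
    if mask.sum() == 0:
        return 0.0
    vote = (val_preds[mask.astype(bool)].mean(axis=0) >= 0.5).astype(int)
    return (vote == y_val).mean()

# Standard GA loop: tournament selection, one-point crossover, bit-flip mutation.
pop = rng.integers(0, 2, size=(30, n_pool))
for _ in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = [pop[scores.argmax()].copy()]               # elitism
    while len(new_pop) < len(pop):
        i, j = rng.choice(len(pop), size=2, replace=False)
        a = pop[i] if scores[i] >= scores[j] else pop[j]   # tournament pick 1
        i, j = rng.choice(len(pop), size=2, replace=False)
        b = pop[i] if scores[i] >= scores[j] else pop[j]   # tournament pick 2
        cut = rng.integers(1, n_pool)                      # one-point crossover
        child = np.concatenate([a[:cut], b[cut:]])
        flip = rng.random(n_pool) < 0.05                   # bit-flip mutation
        child[flip] = 1 - child[flip]
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected members:", np.flatnonzero(best), "fitness:", fitness(best))
```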

Best Suited Machine Learning Techniques for Software Fault Prediction

2020 International journal of recent technology and engineering  
Machine learning techniques for both classification and determination are used for software fault prediction.  ...  Boosting, mentioned in [2], is another long-established ensemble methodology, and AdaBoost is the best-known of the boosting family of algorithms; it trains models sequentially, with each new model  ...  The problem of optimal parameter choice and stopping criteria for ensemble size is not solved. Hansson et al.  ... 
doi:10.35940/ijrte.f9456.038620 fatcat:4bxbol7ux5dwna55q7ep6w5e2y
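
The snippet describes boosting as sequential training in which each new model focuses on previously misclassified instances. To make that weight-update explicit, here is a minimal from-scratch binary AdaBoost loop with decision stumps; it is a generic textbook sketch on synthetic data, not tied to this paper.

```python
# Minimal from-scratch binary AdaBoost loop: train stumps sequentially and
# re-weight the mistakes (labels in {-1, +1}).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y01 = make_classification(n_samples=1000, n_features=10, random_state=8)
y = 2 * y01 - 1                        # map {0,1} -> {-1,+1}

n = len(y)
w = np.full(n, 1.0 / n)                # start with uniform example weights
stumps, alphas = [], []

for _ in range(50):
    stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
    pred = stump.predict(X)
    err = max(w[pred != y].sum(), 1e-12)
    if err >= 0.5:                      # weak learner no better than chance
        break
    alpha = 0.5 * np.log((1 - err) / err)
    # Increase the weight of misclassified examples, decrease the rest.
    w *= np.exp(-alpha * y * pred)
    w /= w.sum()
    stumps.append(stump)
    alphas.append(alpha)

# Final ensemble: sign of the alpha-weighted sum of stump predictions.
F = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
print("training accuracy:", (np.sign(F) == y).mean())
```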
Showing results 1 — 15 out of 329 results