12,327 Hits in 6.6 sec

Graph Ensemble Boosting for Imbalanced Noisy Graph Stream Classification

Shirui Pan, Jia Wu, Xingquan Zhu, Chengqi Zhang
2015 IEEE Transactions on Cybernetics  
Our method, graph ensemble boosting (gEBoost), employs an ensemble-based framework to partition the graph stream into chunks, each containing a number of noisy graphs with imbalanced class distributions.  ...  To tackle concept drift in graph streams, an instance-level weighting mechanism is used to dynamically adjust instance weights, through which the boosting framework can emphasize difficult  ...  Overall Framework: In this paper, we propose an ensemble classification framework, with a linear boosting procedure in each chunk to select discriminative subgraph features and train ensemble-based classifiers  ...  [a minimal sketch of chunk-wise boosting follows this entry]
doi:10.1109/tcyb.2014.2341031 pmid:25167562 fatcat:a3eghn7aqfg7vbkj77nq72qpym
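The gEBoost algorithm itself (with its subgraph feature selection) is not reproduced here; the sketch below only illustrates the generic AdaBoost-style instance reweighting that a chunk-wise boosting framework relies on, assuming each graph in the chunk has already been mapped to a feature vector (`chunk_X`, `chunk_y` are hypothetical inputs with labels in {-1, +1}).

```python
# Generic AdaBoost-style instance reweighting over one data chunk (not the paper's exact method).
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def boost_chunk(chunk_X, chunk_y, n_rounds=10):
    n = len(chunk_y)
    w = np.full(n, 1.0 / n)                  # start with uniform instance weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(chunk_X, chunk_y, sample_weight=w)
        pred = stump.predict(chunk_X)
        err = np.sum(w * (pred != chunk_y)) / np.sum(w)
        if err >= 0.5:                       # weak learner no better than chance; stop
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-10))
        w *= np.exp(-alpha * chunk_y * pred) # emphasize the misclassified (difficult) instances
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```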

Dynamic Weighting Ensembles for Incremental Learning

Xinzhu Yang, Bo Yuan, Wenhuang Liu
2009 Chinese Conference on Pattern Recognition  
Experimental results show that the proposed dynamic weighting scheme achieves better performance than the fixed weighting scheme on a variety of standard UCI benchmark datasets.  ...  This paper investigates the question of solving incremental learning problems using ensemble algorithms.  ...  ALGORITHM FRAMEWORK: This section introduces an algorithm framework for using dynamic weighting ensembles to effectively learn new batches of data appearing over time without the need for retraining.  ...  [a minimal sketch of batch-accuracy weighting follows this entry]
doi:10.1109/ccpr.2009.5344129 fatcat:skcgz6eoznef3ltww7nxij4suy
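The paper's exact weighting scheme is not given in the snippet; as a hedged illustration, the sketch below weights each ensemble member by its accuracy on the most recent batch, so the weights adapt as new batches arrive (the member models, batch arrays, and `classes` list are hypothetical).

```python
# Minimal sketch: recompute ensemble weights from accuracy on the newest batch,
# then combine member votes with those weights.
import numpy as np

def update_weights(members, batch_X, batch_y):
    accs = np.array([np.mean(m.predict(batch_X) == batch_y) for m in members])
    if accs.sum() == 0:
        return np.full(len(members), 1.0 / len(members))
    return accs / accs.sum()                 # recency-based, normalized weights

def predict_weighted(members, weights, X, classes):
    votes = np.zeros((len(X), len(classes)))
    for m, w in zip(members, weights):
        for i, p in enumerate(m.predict(X)):
            votes[i, classes.index(p)] += w  # weighted vote per class
    return [classes[j] for j in votes.argmax(axis=1)]
```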

Synergy of physics-based reasoning and machine learning in biomedical applications: towards unlimited deep learning with limited data

Valeriy Gavrishchaka, Olga Senyukova, Mark Koepke
2019 Advances in Physics: X  
significant data incompleteness, and boosting the accuracy of low-complexity models within the classifier ensemble, as illustrated in physiological-data analysis.  ...  We outline our hybrid framework that leverages existing domain-expert models/knowledge, boosting-like model combination, DNN-based deep learning, and other machine learning algorithms for a drastic reduction of training-data  ...  Disclosure statement: The authors claim no potential conflict of interest exists.  ... 
doi:10.1080/23746149.2019.1582361 fatcat:wkmef4jmgreurnseofsaqa5dva

Stochastic Embedded Probit Regressive Reweight Boost Classifier for Software Quality Examination

2019 International Journal of Recent Technology and Engineering  
In software development, software quality analysis plays a considerable role. Through software testing, quality analysis is performed for efficient prediction of defects in the code.  ...  With the assistance of the Pearson correlative probit regressed reweight boost technique, the classification of program files is performed. The boosting algorithm creates 'm' weak classifiers, i.e.  ...  But the framework failed to perform feature selection.  ...  [a minimal sketch of correlation-based feature ranking with boosting follows this entry]
doi:10.35940/ijrte.c1040.1183s319 fatcat:liqdj52u2zb4zgpxsmokxkhuxa
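The exact "Pearson correlative probit regressed reweight boost" procedure is not spelled out in the snippet; as an assumption-laden sketch, the code below ranks software metrics by absolute Pearson correlation with the defect label and then fits an off-the-shelf boosted classifier (`metrics`, `defect_labels`, and `top_k` are hypothetical).

```python
# Sketch: Pearson-correlation feature ranking followed by a boosted classifier.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def rank_features_by_pearson(metrics, defect_labels, top_k=10):
    corrs = np.array([abs(np.corrcoef(metrics[:, j], defect_labels)[0, 1])
                      for j in range(metrics.shape[1])])
    corrs = np.nan_to_num(corrs)             # guard against constant columns
    return np.argsort(corrs)[::-1][:top_k]   # indices of the most correlated metrics

def fit_reweight_boost(metrics, defect_labels, top_k=10):
    cols = rank_features_by_pearson(metrics, defect_labels, top_k)
    clf = AdaBoostClassifier(n_estimators=50)  # each round reweights misclassified files
    clf.fit(metrics[:, cols], defect_labels)
    return clf, cols
```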

Facial Expression Recognition Using Spatiotemporal Boosted Discriminatory Classifiers [chapter]

Stephen Moore, Eng Jon Ong, Richard Bowden
2010 Lecture Notes in Computer Science  
Detection is efficient as weak classifiers are evaluated using an efficient look-up to a chamfer image. An ensemble framework is presented with all-pairs binary classifiers.  ...  The result of this research is a 6-class classifier (joy, surprise, fear, sadness, anger, and disgust) with recognition rates of up to 95%.  ...  Acknowledgement: This work has been supported by the EPSRC project LILiR and by the FP7 project DICTASIGN (FP7/2007-2013) under grant agreement no. 231135.  ...  [a minimal sketch of all-pairs voting follows this entry]
doi:10.1007/978-3-642-13772-3_41 fatcat:mjptxtpz6vcmtg6kglwgtsfmmy
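All-pairs (one-vs-one) combination is a standard construction; a minimal sketch, using generic SVMs rather than the paper's boosted spatiotemporal classifiers (`X`, `y` are hypothetical feature and label arrays):

```python
# Sketch of an all-pairs (one-vs-one) ensemble: one binary classifier per class pair,
# final label chosen by majority vote over the pairwise decisions.
from itertools import combinations
import numpy as np
from sklearn.svm import SVC

def train_all_pairs(X, y):
    models = {}
    for a, b in combinations(np.unique(y), 2):
        mask = (y == a) | (y == b)
        models[(a, b)] = SVC().fit(X[mask], y[mask])
    return models

def predict_all_pairs(models, x):
    votes = {}
    for pair, m in models.items():
        label = m.predict([x])[0]
        votes[label] = votes.get(label, 0) + 1
    return max(votes, key=votes.get)
```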

Boosting Kernel Models for Regression

Ping Sun, Xin Yao
2006 IEEE International Conference on Data Mining. Proceedings  
This paper proposes a general boosting framework for combining multiple kernel models in the context of both classification and regression problems.  ...  We focus mainly on using the proposed boosting framework to combine kernel ridge regression (KRR) models for regression tasks.  ...  Kin40k: This dataset represents the forward dynamics of an 8-link all-revolute robot arm; the task is to predict the distance of the end-effector from a target, given the twist angles of the 8 links as  ...  [a minimal sketch of boosting KRR models on residuals follows this entry]
doi:10.1109/icdm.2006.30 dblp:conf/icdm/SunY06 fatcat:gglqwzssqvdutmrcavvd3zyy2a
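The paper's specific boosting formulation is not reproduced here; the sketch below shows a generic gradient-boosting-style combination of kernel ridge regression models fitted stage-wise to residuals, assuming scikit-learn's `KernelRidge` and hypothetical training arrays and hyperparameters.

```python
# Sketch: stage-wise boosting of kernel ridge regression (KRR) models on residuals.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

def boost_krr(X, y, n_stages=20, learning_rate=0.1, alpha=1.0, gamma=0.1):
    models, residual = [], y.astype(float).copy()
    for _ in range(n_stages):
        m = KernelRidge(alpha=alpha, kernel="rbf", gamma=gamma).fit(X, residual)
        residual -= learning_rate * m.predict(X)   # shrink the remaining error
        models.append(m)
    return models

def predict_boosted(models, X, learning_rate=0.1):
    # learning_rate must match the value used in boost_krr
    return learning_rate * sum(m.predict(X) for m in models)
```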

Ensemble learning for free with evolutionary algorithms?

Christian Gagné, Michèle Sebag, Marc Schoenauer, Marco Tomassini
2007 Proceedings of the 9th annual conference on Genetic and evolutionary computation - GECCO '07  
as a pool for building classifier ensembles.  ...  This paper has examined the "Evolutionary Ensemble Learning for Free" claim, based on the fact that, since Evolutionary Algorithms maintain a population of solutions, it comes naturally to use these populations  ...  small subsets are selected from the current fold to compute the fitness function, where the selection is nicely based on a mixture of uniform and Boosting-like distributions.  ... 
doi:10.1145/1276958.1277317 dblp:conf/gecco/GagneSST07 fatcat:fdk6ccpvovbrrb4kkyczurcw24

Active Collaborative Ensemble Tracking [article]

Kourosh Meshgi, Maryam Sadat Mirzaei, Shigeyuki Oba, Shin Ishii
2017 arXiv   pre-print
A discriminative ensemble tracker employs multiple classifiers, each of which casts a vote on all of the obtained samples. The votes are then aggregated in an attempt to localize the target object.  ...  The weight of each classifier is calculated based on its agreement with the whole ensemble.  ...  [a minimal sketch of agreement-based weighting follows this entry]
arXiv:1704.08821v1 fatcat:2wnwmjmjpnhypgvgpj5nj2ogju
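The tracker's exact update rule is not given in the snippet; as a rough sketch, each classifier's weight below is set by how often its votes agree with the ensemble's aggregated decision on the current samples (the `predict` interface returning +1/-1 votes is a hypothetical assumption).

```python
# Sketch: weight each classifier by its agreement with the ensemble's majority vote.
import numpy as np

def agreement_weights(classifiers, samples):
    votes = np.array([c.predict(samples) for c in classifiers])  # (n_classifiers, n_samples)
    consensus = np.sign(votes.sum(axis=0))                       # majority vote per sample
    agreement = (votes == consensus).mean(axis=1)                # fraction of agreeing votes
    return agreement / agreement.sum()
```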

Improving Drug Sensitivity Prediction Using Different Types of Data

HA Hejase, C Chan
2015 CPT: Pharmacometrics & Systems Pharmacology  
In subchallenge 2, a weighted Euclidean distance method is introduced to predict and rank the drug combinations from the most to the least effective in reducing the viability of a diffuse large B-cell  ...  In subchallenge 1, a bidirectional search algorithm is introduced and optimized using an ensemble scheme, and a nonlinear support vector machine (SVM) is then applied to predict the effects of the drug  ...  This study was supported in part by the National Institutes of Health (R01GM079688, R01GM089866, and R21CA176854) and the National Science Foundation (CBET 0941055).  ...  [a minimal sketch of weighted-distance ranking follows this entry]
doi:10.1002/psp4.2 pmid:26225231 pmcid:PMC4360670 fatcat:ail5vu463rdntp6zm3l3cdm7ca
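As a hedged illustration of ranking by a weighted Euclidean distance (the feature vectors, weights, and reference profile here are hypothetical, not the challenge's actual features):

```python
# Sketch: rank drug combinations by weighted Euclidean distance to a reference
# "most effective" profile; smaller distance = predicted to be more effective.
import numpy as np

def weighted_euclidean(x, ref, w):
    return np.sqrt(np.sum(w * (x - ref) ** 2))

def rank_combinations(profiles, ref, w):
    # profiles: dict mapping combination name -> feature vector (hypothetical)
    dists = {name: weighted_euclidean(vec, ref, w) for name, vec in profiles.items()}
    return sorted(dists, key=dists.get)       # ordered from most to least effective
```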

Ensemble Learning for Free with Evolutionary Algorithms ? [article]

Christian Gagné, Michèle Sebag (INRIA Futurs), Marc Schoenauer
2007 arXiv   pre-print
Meanwhile, Ensemble Learning, one of the most efficient approaches in supervised Machine Learning for the last decade, proceeds by building a population of diverse classifiers.  ...  First, a new fitness function, inspired by co-evolution and enforcing classifier diversity, is presented. Further, a new selection criterion based on the classification margin is proposed.  ...  The second and third authors gratefully acknowledge the support of the Pascal Network of Excellence IST-2002-506 778.  ... 
arXiv:0704.3905v1 fatcat:4sbd6bu4xbbnjbaaqkozfjp7ka

Robust Framework to Combine Diverse Classifiers Assigning Distributed Confidence to Individual Classifiers at Class Level

Shehzad Khalid, Sannia Arshad, Sohail Jabbar, Seungmin Rho
2014 The Scientific World Journal  
A weight learning method is then introduced to learn per-class weights for the different classifiers used to construct an ensemble.  ...  We have presented a classification framework that combines multiple heterogeneous classifiers in the presence of class-label noise.  ...  Table 9: Confusion matrices of (a) AdaBoost, (b) Bagging, (c) RSM, and (d) the proposed approach using the Heart dataset.  ...  [a minimal sketch of class-level weighting follows this entry]
doi:10.1155/2014/492387 pmid:25295302 pmcid:PMC4177094 fatcat:ahwjct6eu5auhp7qxps7p5dvuy
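A minimal sketch of class-level weighting, assuming weights are estimated from each classifier's per-class recall on a validation set and summed at prediction time (the validation arrays and `classes` list are hypothetical, not the paper's exact learning rule).

```python
# Sketch: learn per-class weights for each classifier from validation recall,
# then combine predictions with class-specific confidence.
import numpy as np

def class_level_weights(classifiers, X_val, y_val, classes):
    W = np.zeros((len(classifiers), len(classes)))
    for i, clf in enumerate(classifiers):
        pred = clf.predict(X_val)
        for j, c in enumerate(classes):
            mask = (y_val == c)
            W[i, j] = np.mean(pred[mask] == c) if mask.any() else 0.0  # recall on class c
    return W / (W.sum(axis=0, keepdims=True) + 1e-12)

def predict_class_weighted(classifiers, W, x, classes):
    scores = np.zeros(len(classes))
    for i, clf in enumerate(classifiers):
        j = classes.index(clf.predict([x])[0])
        scores[j] += W[i, j]      # add this classifier's confidence for its predicted class
    return classes[int(scores.argmax())]
```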

DCSO: Dynamic Combination of Detector Scores for Outlier Ensembles [article]

Yue Zhao, Maciej K. Hryniewicki
2019 arXiv   pre-print
In this paper, an unsupervised outlier detector combination framework called DCSO is proposed, demonstrated, and assessed for the dynamic selection of the most competent base detectors, with an emphasis on  ...  Selecting and combining the outlier scores of different base detectors used within outlier ensembles can be quite challenging in the absence of ground truth.  ...  CONCLUSIONS: A new and improved unsupervised framework called DCSO (Dynamic Combination of Detector Scores for Outlier Ensembles) is proposed and assessed for the selection and combination of base outlier  ...  [a minimal sketch of local competence selection follows this entry]
arXiv:1911.10418v1 fatcat:polqgqik65cj7eiu3k7uzr6od4
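DCSO's actual competence measure is not reproduced here; the sketch below only illustrates the general idea of dynamic selection: for a test point, pick the base detector whose scores correlate best with the ensemble's average score over that point's nearest training neighbours (the fitted-detector interface and `k` are hypothetical assumptions).

```python
# Sketch of dynamic detector selection in a local region (not the exact DCSO measure).
# Each detector is assumed to be fitted and to expose decision_function(X) -> outlier scores.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def select_locally_competent(detectors, X_train, x_test, k=10):
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    idx = nn.kneighbors([x_test], return_distance=False)[0]      # local region of x_test
    local_scores = np.array([d.decision_function(X_train[idx]) for d in detectors])
    pseudo_truth = local_scores.mean(axis=0)                     # ensemble average as pseudo target
    competence = [np.corrcoef(s, pseudo_truth)[0, 1] for s in local_scores]
    best = int(np.argmax(competence))
    return best, detectors[best].decision_function([x_test])[0]
```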

Margin Boost Clustering based Multivariate Dolphin Swarm Optimization for Routing and Reliable Data Dissemination in VANET

2019 Volume 8, Issue 10, August 2019 (Regular Issue)  
Mean-shift margin boost clustering is an ensemble clustering technique that divides the total network into a number of groups.  ...  Multivariate Dolphin Swarm Optimized Routing (MDSOR) is a cluster-based optimization that selects the optimal cluster head based on a fitness function in terms of distance, signal strength, and bandwidth.  ...  Based on the error value, the initial weight is updated. The boosting classifier selects the weak learner with the minimum training error.  ...  [a minimal sketch of a weighted fitness function follows this entry]
doi:10.35940/ijitee.k1780.0981119 fatcat:vchejhr5zvbdtcpprvj6rolaiy
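The MDSOR fitness function is only named in the snippet; a minimal sketch of a weighted fitness over distance, signal strength, and bandwidth (the weights, normalization constants, and candidate format are hypothetical, not the paper's values):

```python
# Sketch: weighted fitness for cluster-head selection; closer nodes and higher
# signal strength/bandwidth score higher. All constants are illustrative.
def fitness(distance, signal_strength, bandwidth,
            w_dist=0.4, w_sig=0.3, w_bw=0.3,
            max_dist=300.0, max_sig=1.0, max_bw=54.0):
    return (w_dist * (1.0 - distance / max_dist)
            + w_sig * (signal_strength / max_sig)
            + w_bw * (bandwidth / max_bw))

def pick_cluster_head(candidates):
    # candidates: list of (node_id, distance, signal_strength, bandwidth) tuples (hypothetical)
    return max(candidates, key=lambda c: fitness(c[1], c[2], c[3]))[0]
```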

Intelligent Decision Support System for Predicting Student's E-Learning Performance Using Ensemble Machine Learning

Farrukh Saleem, Zahid Ullah, Bahjat Fakieh, Faris Kateb
2021 Mathematics  
The dataset chosen in this study comes from a learning management system providing a number of features for predicting students' performance.  ...  The integration of the ML models improved the prediction ratio and performed better than all other ensemble approaches.  ...  A particular example is classified according to its nearest training example, based on a distance measurement [53].  ...  [a minimal sketch of nearest-neighbour classification follows this entry]
doi:10.3390/math9172078 fatcat:li5ckuchrrah3pnyfp5kxaw4gm
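The distance-based classification mentioned in the snippet corresponds to nearest-neighbour classification; a minimal sketch, with hypothetical feature and label arrays:

```python
# Sketch: classify an example by the label of its nearest training example (1-NN).
import numpy as np

def nearest_neighbor_predict(X_train, y_train, x):
    dists = np.linalg.norm(X_train - x, axis=1)   # Euclidean distance to every training row
    return y_train[int(np.argmin(dists))]
```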

COGNITIVE AND BEHAVIORAL MODEL ENSEMBLES FOR AUTONOMOUS VIRTUAL CHARACTERS

Jeffrey S. Whiting, Jonathan Dinerstein, Parris K. Egbert, Dan Ventura
2010 Computational Intelligence  
Action Selection: Once the ensemble controller has the action and weight tuple (a_i, w_i) for each model, it chooses an action based on the weighting. There are several ways this can be done.  ...  B2: A V-formation model that dynamically forms flocks in the shape of a V [Gervasi and Prencipe 2004]. B3: A stunt model that can perform a number of different scripted stunts.  ...  [a minimal sketch of weight-based action selection follows this entry]
doi:10.1111/j.1467-8640.2009.00345.x fatcat:sn7jlprvyrdoto6642b77tf7ba
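The snippet notes that the controller can choose an action from its (a_i, w_i) tuples in several ways; a minimal sketch of two common options, highest-weight selection and weight-proportional sampling (the list of proposals is hypothetical):

```python
# Sketch: two ways to choose an action from (action, weight) tuples proposed by the models.
import random

def select_greedy(proposals):
    # proposals: list of (action, weight) tuples, one per behavioural model (hypothetical)
    return max(proposals, key=lambda p: p[1])[0]

def select_stochastic(proposals):
    actions, weights = zip(*proposals)
    return random.choices(actions, weights=weights, k=1)[0]   # weight-proportional pick
```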
Showing results 1 — 15 out of 12,327 results