Boosting for Learning Multiple Classes with Imbalanced Class Distribution
2006
IEEE International Conference on Data Mining. Proceedings
In this paper, we develop a cost-sensitive boosting algorithm to improve the classification performance of imbalanced data involving multiple classes. ...
class distribution and equal misclassification costs. ...
Boosting algorithms change the underlying data distribution and apply the standard classifier learning algorithms to the revised data space iteratively. ...
doi:10.1109/icdm.2006.29
dblp:conf/icdm/SunKW06
fatcat:duhqpxu2hfbjbhhenaj52bdpta
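The snippet above only outlines the cost-sensitive boosting idea. As a hedged illustration (not the paper's algorithm), the sketch below scales an AdaBoost-style weight update by a per-class misclassification cost so that rare-class errors gain weight faster; the cost dictionary and the decision-stump weak learner are assumptions made here.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cost_sensitive_boost(X, y, costs, n_rounds=50):
    """Illustrative cost-sensitive boosting loop (not the paper's algorithm).

    costs: dict mapping class label -> misclassification cost; rare classes
    are given larger costs so their errors keep more weight each round.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)                      # uniform initial weights
    per_example_cost = np.array([costs[label] for label in y])
    learners, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        err = np.sum(w * miss) / np.sum(w)
        if err == 0 or err >= 0.5:               # weak-learner stopping rule
            break
        alpha = 0.5 * np.log((1 - err) / err)
        # Scale the exponential update by the class-dependent cost so that
        # misclassified rare-class examples gain weight faster.
        w *= np.exp(alpha * per_example_cost * miss)
        w /= w.sum()
        learners.append(stump)
        alphas.append(alpha)
    return learners, alphas
```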
On the Class Imbalance Problem
2008
2008 Fourth International Conference on Natural Computation
In this case, standard machine learning algorithms tend to be overwhelmed by the majority class and ignore the minority class since traditional classifiers seeking an accurate performance over a full range ...
Following surveying evaluation metrics and some other related factors, this paper showed some future directions at last. ...
Thus, it can be used to address the class imbalance problem. Another algorithm that uses boosting to address the class imbalance problem is SMOTEBoost [45]. ...
doi:10.1109/icnc.2008.871
dblp:conf/icnc/GuoYDYZ08
fatcat:2uvodh4f5bbf5k5g5h6hl3cjlm
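SMOTEBoost, cited in the snippet above, adds SMOTE-style synthetic minority examples at each boosting round. The sketch below shows only the SMOTE interpolation step, under the assumption of a k-nearest-neighbour search over the minority class; the boosting wrapper and the papers' parameter choices are omitted.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def smote_sample(minority_X, k=5, rng=None):
    """Synthesize one minority-class example by SMOTE-style interpolation
    between a random minority point and one of its k nearest neighbours."""
    rng = rng or np.random.default_rng()
    nn = NearestNeighbors(n_neighbors=k + 1).fit(minority_X)
    i = rng.integers(len(minority_X))
    # kneighbors returns the query point itself at index 0, so skip it.
    _, idx = nn.kneighbors(minority_X[i:i + 1])
    j = rng.choice(idx[0][1:])
    gap = rng.random()                    # interpolation factor in [0, 1)
    return minority_X[i] + gap * (minority_X[j] - minority_X[i])
```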
Predicting class-imbalanced business risk using resampling, regularization, and model ensembling algorithms
[article]
2019
arXiv
pre-print
Two ensembling techniques, including Bagging and Boosting, are applied on the DT classifier for further model improvement. ...
We aim at developing and improving the imbalanced business risk modeling via jointly using proper evaluation criteria, resampling, cross-validation, classifier regularization, and ensembling techniques ...
the rare class; and (3) the usage of weak classifiers or classifiers without regularization [11][12][13]. ...
arXiv:1903.05535v1
fatcat:sqdduhvydbc6bllcnvya4zgvpi
Transfer learning for class imbalance problems with inadequate data
2015
Knowledge and Information Systems
to improve classification. ...
We propose a novel boosting-based instance transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain ...
, IIS-1242304, and IIS-1527827. ...
doi:10.1007/s10115-015-0870-3
pmid:27378821
pmcid:PMC4929860
fatcat:qvlty4b4evfohd5fnsxxtk4y7y
RARE CLASS PROBLEM IN DATA MINING: REVIEW
2017
International Journal of Advanced Research in Computer Science
This leads to good overall accuracy but a poor minority-class detection rate. Many algorithms have been proposed to deal with the imbalanced data problem, but each has its pros and cons. ...
An imbalanced dataset means the ratio of positive to negative classes is not balanced. ...
The second approach is the algorithmic approach, where a specialized algorithm is designed to handle the rare class problem. ...
doi:10.26483/ijarcs.v8i7.4530
fatcat:rkjrfhhuxvfddgqqszfnk7wjfu
Peer-to-Peer Multi-class Boosting
[chapter]
2012
Lecture Notes in Computer Science
We evaluate the robustness and the convergence speed of the algorithm empirically over three benchmark databases. ...
We compare the algorithm with the sequential AdaBoost algorithm and we test its performance in a failure scenario involving message drop and delay, and node churn. ...
To achieve this, we proposed a modification of FilterBoost that allows it to learn multi-class models in a purely online fashion, and we proved theoretically that the resulting algorithm optimizes a suitably ...
doi:10.1007/978-3-642-32820-6_39
fatcat:6447hefvuraf5d3bxt7jre4h5y
RUSBoost: A Hybrid Approach to Alleviating Class Imbalance
2010
IEEE transactions on systems, man and cybernetics. Part A. Systems and humans
This algorithm provides a simpler and faster alternative to SMOTEBoost, which is another algorithm that combines boosting and data sampling. ...
Several techniques have been used to alleviate the problem of class imbalance, including data sampling and boosting. ...
ACKNOWLEDGMENT The authors would like to thank the anonymous reviewers and the Associate Editor for the constructive evaluation of this paper and also the various members of the Data Mining and Machine ...
doi:10.1109/tsmca.2009.2029559
fatcat:ulvctf6gkjcqbbulhwpko377wu
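As the abstract notes, RUSBoost pairs random undersampling (RUS) with boosting: before each round the majority class is randomly undersampled, the weak learner is trained on the balanced subset, and the boosting weights are updated on the full set. Below is a minimal sketch of the per-round undersampling step only; the exact 1:1 target ratio and the binary-class setting are assumptions, not details taken from the paper.

```python
import numpy as np

def rus_round(X, y, w, majority_label, rng=None):
    """One RUS-style step: randomly drop majority-class examples until the
    classes are balanced, keeping the boosting weights of retained examples."""
    rng = rng or np.random.default_rng()
    maj = np.where(y == majority_label)[0]
    mino = np.where(y != majority_label)[0]
    keep = np.concatenate([rng.choice(maj, size=len(mino), replace=False), mino])
    return X[keep], y[keep], w[keep]
```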
Alleviating class imbalance problem in data mining
2013
2013 21st Signal Processing and Communications Applications Conference (SIU)
Boosting is a preferred algorithm when the class distribution is imbalanced. The boosting method increases classification performance by focusing on examples that are difficult to classify. ...
The second approach is the algorithm level, which develops new algorithms that handle class imbalance efficiently to improve classification performance. ...
doi:10.1109/siu.2013.6531574
dblp:conf/siu/SarmanovaA13
fatcat:haru6aqulfbpfhss46cp2vu24i
Improved Multi-Class Cost-Sensitive Boosting via Estimation of the Minimum-Risk Class
[article]
2016
arXiv
pre-print
Our method jointly optimizes binary weak learners and their corresponding output vectors, requiring classes to share features at each iteration. ...
We evaluate our method on a variety of datasets: a collection of synthetic planar data, common UCI datasets, MNIST digits, SUN scenes, and CUB-200 birds. ...
to the boosted methods to ensure a fair comparison. ...
arXiv:1607.03547v2
fatcat:y5ebtcywyrgtdly3y2mxbydkny
Predicting Rare Classes: Comparing Two-Phase Rule Induction to Cost-Sensitive Boosting
[chapter]
2002
Lecture Notes in Computer Science
Boosting is a strong meta-classifier approach, and has been shown to be adaptable to skewed class distributions. ...
We also show similar supporting results on real-world and benchmark datasets. If a classifier detects m examples to be of class C, out of which l indeed belong to C, then its precision (P) for class ...
Acknowledgments The contribution to this work by Prof. ...
doi:10.1007/3-540-45681-3_20
fatcat:27wfzwie5nbhrdlicpktqfvggu
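The truncated footnote above defines per-class precision: the fraction of a classifier's detections for class C that are correct. The recall formula and the symbol n_C are added here for completeness and are not part of the snippet.

```latex
% Per-class precision, as in the truncated footnote: the classifier labels
% m examples as class C, of which l truly belong to C.
\[ P(C) = \frac{l}{m} \]
% Recall for class C, added for completeness; n_C is the number of true
% examples of class C in the data.
\[ R(C) = \frac{l}{n_C} \]
```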
A Hierarchical Methodology for Class Detection Problems with Skewed Priors
2005
Journal of Classification
We describe a novel extension to the Class-Cover-Catch-Digraph (CCCD) classifier, specifically tuned to detection problems. ...
Our principal contribution consists of two boosted classifiers built upon the CCCD structure, one in the form of a sequential decision process and the other in the form of a tree. ...
Algorithm 1 shows the steps in the boosted training algorithm. Here, the indexes i and j correspond to stages and sub-stages, respectively. ...
doi:10.1007/s00357-005-0004-9
fatcat:orbivds4ufgffmfyy3lxzmnozu
Boosting methods for multi-class imbalanced data classification: an experimental review
2020
Journal of Big Data
A thorough empirical comparison is conducted to analyze the performance of binary and multi-class boosting algorithms on various multi-class imbalanced datasets. ...
The experimental studies show that the CatBoost and LogitBoost algorithms are superior to other boosting algorithms on multi-class imbalanced conventional and big datasets, respectively. ...
NR: writing, designing the experiment under the supervision of JT and MA as academic supervisors. All authors read and approved the final manuscript.
Funding Not applicable. ...
doi:10.1186/s40537-020-00349-y
fatcat:v6yipwuipzeynmj4obwlkypuha
Handling class imbalance in customer behavior prediction
2014
2014 International Conference on Collaboration Technologies and Systems (CTS)
Using a more appropriate evaluation metric (AUC), we investigated the performance gains of under-sampling and two machine learning algorithms (weighted Random Forests and RUSBoost) against a benchmark ...
RUSBoost, as a specific algorithm designed to deal with the class imbalance problem, is also effective but not as good as under-sampling. ...
RUSBoost's effectiveness in handling class imbalance comes from combining two traditional methods, sampling and boosting, both of which are effective for imbalance problems. ...
doi:10.1109/cts.2014.6867549
dblp:conf/cts/LiuWAA14
fatcat:siijogd6dzdxtem3l7fmscva4y
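The abstract above compares under-sampling, weighted Random Forests, and RUSBoost using AUC rather than accuracy. As a hedged illustration of that evaluation setup only, the snippet below scores a class-weighted random forest by AUC on synthetic imbalanced data; the dataset and hyperparameters are placeholders, not the paper's.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic 95/5 imbalanced data standing in for the paper's customer dataset.
X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# class_weight="balanced" reweights classes inversely to their frequency.
clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                             random_state=0).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```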
Feature Selection and Ensemble Learning Techniques in One-Class Classifiers: An Empirical Study of Two-Class Imbalanced Datasets
2021
IEEE Access
Most solutions, including data-level, algorithm-level, and cost-sensitive approaches, are derived using multi-class classifiers, depending on the number of classes to be classified. ...
data in the majority class and ensemble learning is employed to combine multiple OCC classifiers. ...
Sundar and Punniyamoorthy make two major modifications to Wang's Boosted SVM (WBSVM) algorithm to improve the classification performance without increasing its existing time complexity [13]. ...
doi:10.1109/access.2021.3051969
fatcat:43crcel5rbbvnfadjbyypdefdy
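The abstract above builds one-class classifiers (OCC) on the majority-class data and combines them with ensemble learning. The sketch below illustrates that general idea under illustrative assumptions (OneClassSVM as the one-class model, bootstrap sampling, and majority voting as the combiner), none of which are claimed to be the paper's configuration.

```python
import numpy as np
from sklearn.svm import OneClassSVM

def fit_occ_ensemble(X_majority, n_models=5, rng=None):
    """Fit several one-class SVMs on bootstrap samples of the majority class."""
    rng = rng or np.random.default_rng()
    models = []
    for _ in range(n_models):
        idx = rng.choice(len(X_majority), size=len(X_majority), replace=True)
        models.append(OneClassSVM(nu=0.1).fit(X_majority[idx]))
    return models

def predict_minority(models, X):
    """Flag an example as minority when most one-class models call it an
    outlier; OneClassSVM.predict returns +1 for inliers, -1 for outliers."""
    outlier_votes = np.mean([m.predict(X) == -1 for m in models], axis=0)
    return outlier_votes >= 0.5
```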
Imbalanced Class Learning in Epigenetics
2014
Journal of Computational Biology
Datasets with a large imbalance between the minority and majority classes hinder learning with any classifier. ...
For this class imbalance problem, a number of algorithms are compared, including the TAN + AdaBoost algorithm. ...
Because EasyEnsemble uses boosting inside each bag (in bagging), this combination of boosting (which reduces bias) and bagging (which reduces variance) improves performance. ...
doi:10.1089/cmb.2014.0008
pmid:24798423
pmcid:PMC4082351
fatcat:m6yuruvgmnbm5jkhugqa7pbgbu
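The abstract above attributes EasyEnsemble's benefit to running boosting inside bagging: each bag is a balanced random subset, a boosted model is fit on it, and the boosted models are averaged. A minimal sketch under those assumptions follows; scikit-learn's AdaBoostClassifier as the inner booster and simple probability averaging are illustrative choices, not details from the paper.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier

def easy_ensemble_fit(X, y, majority_label, n_bags=10, rng=None):
    """Fit one AdaBoost model per balanced bag (EasyEnsemble-style)."""
    rng = rng or np.random.default_rng()
    maj = np.where(y == majority_label)[0]
    mino = np.where(y != majority_label)[0]
    models = []
    for _ in range(n_bags):
        # Each bag pairs all minority examples with an equal-sized random
        # draw from the majority class, then boosts on the balanced subset.
        bag = np.concatenate([mino, rng.choice(maj, size=len(mino), replace=False)])
        models.append(AdaBoostClassifier(n_estimators=50).fit(X[bag], y[bag]))
    return models

def easy_ensemble_predict_proba(models, X):
    """Average the per-bag probability estimates (the bagging step)."""
    return np.mean([m.predict_proba(X) for m in models], axis=0)
```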