Sparsity-driven weighted ensemble classifier
[article]
2016
arXiv
pre-print
In this letter, a novel weighted ensemble classifier is proposed that improves classification accuracy and minimizes the number of classifiers. ...
The ensemble weight-finding problem is modeled as a cost function with the following terms: (a) a data fidelity term aiming to decrease the misclassification rate, (b) a sparsity term aiming to decrease the number ...
Conclusion In this article, a novel sparsity-driven ensemble classifier method has been presented. ...
arXiv:1610.00270v2
fatcat:ehutcov5cfgqbk53ow7lhw77z4
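The cost function sketched in the SDWEC abstract — a data-fidelity term plus a sparsity term on the ensemble weights — can be illustrated with a squared-loss surrogate minimized by ISTA. This is a hypothetical simplification for illustration; the paper's actual loss and solver may differ.

```python
import numpy as np

def fit_sparse_ensemble_weights(P, y, lam=20.0, n_iter=500):
    """ISTA on 0.5*||P w - y||^2 + lam*||w||_1.

    P: (n_samples, n_classifiers) matrix of base classifier outputs in {-1, +1}.
    y: (n_samples,) true labels in {-1, +1}.
    The squared loss stands in for the data-fidelity (misclassification) term;
    the l1 soft-threshold drives weights of unhelpful classifiers exactly to zero.
    """
    w = np.zeros(P.shape[1])
    step = 1.0 / np.linalg.norm(P, 2) ** 2   # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        z = w - step * (P.T @ (P @ w - y))   # gradient step on the fidelity term
        w = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold
    return w
```

Because the soft-threshold produces exact zeros, the fitted ensemble uses only a subset of the base classifiers, which is the "minimizes the number of classifiers" effect the abstract describes.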
Sparsity-driven weighted ensemble classifier
2018
International Journal of Computational Intelligence Systems
In this study, a novel sparsity-driven weighted ensemble classifier (SDWEC) that improves classification accuracy and minimizes the number of classifiers is proposed. ...
The efficiency of SDWEC was tested on 11 datasets and compared with the state-of-the art classifier ensemble methods. ...
Conclusion In this article, a novel sparsity-driven ensemble classifier method, SDWEC, has been presented. ...
doi:10.2991/ijcis.11.1.73
fatcat:q63y2wjadzedlm2wn7peoasry4
Sparse Ensemble Learning for Concept Detection
2012
IEEE transactions on multimedia
This work presents a novel sparse ensemble learning scheme for concept detection in videos. ...
The resultant ensemble model is, therefore, sparse, in the way that only a small number of efficient classifiers in the ensemble will fire on a testing sample. ...
To acquire diversity of classifiers in the ensemble, we proposed a novel space-based instance partitioning scheme via sparse NMF. ...
doi:10.1109/tmm.2011.2168198
fatcat:yg5dvk75qvgilgsxvdrslblguu
An Enhanced Ensemble Approach for Non-Intrusive Energy Use Monitoring Based on Multidimensional Heterogeneity
2021
Sensors
Then, a sparse coding model is allocated for individual classifiers, and the combined classifier is diversified by introducing different distance and similarity measures without consideration of sparsity ...
Firstly, the idea of utilizing a heterogeneous design is presented, and the corresponding ensemble framework for load disaggregation is established. ...
Acknowledgments: The authors sincerely thank Tiancheng Eric Song from Southeast University for reviewing and improving the English of this article. ...
doi:10.3390/s21227750
pmid:34833825
pmcid:PMC8618865
fatcat:sppgjkjj5jh33j753sq3b6ccpu
Max-Margin Stacking and Sparse Regularization for Linear Classifier Combination and Selection
[article]
2011
arXiv
pre-print
The main principle of stacked generalization (or Stacking) is using a second-level generalizer to combine the outputs of base classifiers in an ensemble. ...
With the non-diverse ensembles, we even gain accuracy on average by using sparse regularization. ...
In the second ensemble setup, we trained a total of 154 SVMs with different kernel functions and parameters. The latter method produces less diverse base classifiers than the former. ...
arXiv:1106.1684v1
fatcat:nlhcz5m3ejdnxotat25ov7s6qu
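The combination described in this entry — a max-margin second-level generalizer over base classifier outputs, with sparse regularization — can be sketched as a hinge-loss linear combiner with an l1 penalty trained by subgradient descent. This is an illustrative stand-in, not the authors' exact formulation.

```python
import numpy as np

def sparse_maxmargin_stack(P, y, lam=0.05, lr=0.01, n_iter=2000):
    """Level-2 linear combiner over base classifier outputs P (n, m) in {-1, +1}.

    Minimizes mean hinge loss (max-margin term) plus lam * ||w||_1 by
    subgradient descent; the l1 term encourages a sparse selection of
    base classifiers.
    """
    w = np.zeros(P.shape[1])
    for _ in range(n_iter):
        active = y * (P @ w) < 1.0                         # margin violators
        grad = -(P[active].T @ y[active]) / len(y) + lam * np.sign(w)
        w -= lr * grad
    return w
```

With non-diverse ensembles (many near-duplicate base classifiers), the l1 penalty tends to keep one representative per redundant group, which is consistent with the abstract's observation that sparse regularization can even gain accuracy on average.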
Pruning of Error Correcting Output Codes by optimization of accuracy–diversity trade off
2014
Machine Learning
In this paper, we propose a novel approach for pruning the ECOC matrix by utilizing accuracy and diversity information simultaneously. ...
Ensemble learning is a method of combining learners to obtain more reliable and accurate predictions in supervised and unsupervised learning. ...
We thank Atabey Kaygun for assistance with SDP application, and anonymous reviewers for comments that greatly improved the manuscript. ...
doi:10.1007/s10994-014-5477-5
fatcat:tdbuge7lknactaexfrnbx75doq
Deep Ensembling with No Overhead for either Training or Testing: The All-Round Blessings of Dynamic Sparsity
[article]
2022
arXiv
pre-print
Despite being an ensemble method, FreeTickets has even fewer parameters and training FLOPs than a single dense model. ...
In this work, we draw a unique connection between sparse neural network training and deep ensembles, yielding a novel efficient ensemble learning framework called FreeTickets. ...
We would like to thank Bram Grooten for giving feedback on the camera-ready version; the professional and conscientious reviewers of ICLR for providing valuable comments. Z. ...
arXiv:2106.14568v4
fatcat:3tirppshfvcqtkawka2rhw5zim
Prune2Edge: A Multi-Phase Pruning Pipelines to Deep Ensemble Learning in IIoT
[article]
2021
arXiv
pre-print
In this paper, we propose a novel edge-based multi-phase pruning pipeline for ensemble learning on IIoT devices. ...
In the first phase, we generate a diverse ensemble of pruned models, then we apply integer quantisation, next we prune the generated ensemble using a clustering-based technique. ...
Experimentally, on CIFAR-10 and CIFAR-100, our proposed method was able to produce classifiers with higher accuracy levels and up to a 90% reduction in model size. ...
arXiv:2004.04710v2
fatcat:dlxzje7gqnagpg3cevw4t2t3mi
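Of the pipeline's phases, the integer quantisation step is the simplest to sketch. A minimal symmetric per-tensor int8 scheme might look like the following (a hypothetical simplification; the paper's quantiser may differ):

```python
import numpy as np

def int8_quantise(w):
    """Symmetric per-tensor int8 quantisation: map float weights to [-127, 127]."""
    scale = max(np.abs(w).max() / 127.0, 1e-12)   # guard against all-zero tensors
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantise(q, scale):
    """Recover approximate float weights from the int8 codes."""
    return q.astype(np.float32) * scale
```

Quantising to int8 shrinks weight storage roughly 4x versus float32; combined with pruning and clustering-based ensemble reduction, this is where large model-size reductions come from.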
Ensemble sparse classification of Alzheimer's disease
2012
NeuroImage
In this paper, instead of building a single global classifier, we propose a local patch-based subspace ensemble method which builds multiple individual classifiers based on different subsets of local patches ...
classification, respectively, demonstrating a very promising performance of our method compared with the state-of-the-art methods for AD/MCI classification using MR images. ...
Hoffman-La Roche, Schering-Plough, Synarc, Inc., as well as non-profit partners the Alzheimer's Association and Alzheimer's Drug Discovery Foundation, with participation from the U.S. ...
doi:10.1016/j.neuroimage.2012.01.055
pmid:22270352
pmcid:PMC3303950
fatcat:wyskvqciezbtrcn5a5lc3udfzi
Divide, Denoise, and Defend against Adversarial Attacks
[article]
2019
arXiv
pre-print
We present an analysis of the tradeoff between accuracy and robustness against adversarial attacks. We evaluate our method under black-box, grey-box, and white-box settings. ...
On the ImageNet dataset, our method outperforms the state-of-the-art by 19.7% under grey-box setting, and performs comparably under black-box setting. ...
We use a novel patch selection algorithm that is optimized to improve robustness of the classifier. ...
arXiv:1802.06806v2
fatcat:kow6q2mphjck5p3tltyf6c2r7a
Feature Representation Using Deep Autoencoder for Lung Nodule Image Classification
2018
Complexity
Experimental results with regard to different parameter setting, data augmentation, model sparsity, classifier algorithms, and model ensemble validate the effectiveness of our proposed approach. ...
In this work, we propose a novel model for lung nodule image feature representation that incorporates both local and global characters. ...
The authors gratefully acknowledge the support of NVIDIA Corporation with the donation of GPU used for this research. ...
doi:10.1155/2018/3078374
fatcat:24pjgn2kl5ewlkndouprcrrnza
Exclusivity Regularized Machine: A New Ensemble SVM Classifier
2017
Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence
The diversity of base learners is of utmost importance to a good ensemble. This paper defines a novel measurement of diversity, termed exclusivity. ...
With the designed exclusivity, we further propose an ensemble SVM classifier, namely Exclusivity Regularized Machine (ExRM), to jointly suppress the training error of ensemble and enhance the diversity ...
Haibin Ling is supported by National Natural Science Foundation of China (grant no. 61528204) and National Science Foundation (grant no. 1350521). ...
doi:10.24963/ijcai.2017/241
dblp:conf/ijcai/GuoWL17
fatcat:sstxcdvfbrbzzmtq24fppsi3bq
Data-Driven Diverse Logistic Regression Ensembles
[article]
2021
arXiv
pre-print
Measures of diversity in classifier ensembles are used to show how our method learns the ensemble by exploiting the accuracy-diversity trade-off for ensemble models. ...
A novel framework for statistical learning is introduced which combines ideas from regularization and ensembling. ...
To select the tuning parameters we alternate between a grid search for the sparsity penalty and a grid search for the diversity penalty, such that the cross-validated loss of the ensemble classifier is ...
arXiv:2102.08591v4
fatcat:zz3u5olgznhhljqy5gaanw3ole
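The alternating tuning procedure described in the snippet above can be sketched generically. Here `cv_loss` is a hypothetical user-supplied callback returning the cross-validated ensemble loss for a given pair of penalties; the paper's exact procedure may differ.

```python
def alternating_grid_search(cv_loss, sparsity_grid, diversity_grid, n_rounds=3):
    """Alternate 1-D grid searches: hold the diversity penalty fixed and sweep
    the sparsity penalty, then hold sparsity fixed and sweep diversity, and
    repeat, so the cross-validated loss is non-increasing round by round."""
    lam_s, lam_d = sparsity_grid[0], diversity_grid[0]
    for _ in range(n_rounds):
        lam_s = min(sparsity_grid, key=lambda s: cv_loss(s, lam_d))
        lam_d = min(diversity_grid, key=lambda d: cv_loss(lam_s, d))
    return lam_s, lam_d
```

Alternating two 1-D sweeps costs |S| + |D| evaluations per round instead of the |S| x |D| of a full 2-D grid, at the price of possibly settling on a coordinate-wise rather than global optimum.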
Learning to Diversify via Weighted Kernels for Classifier Ensemble
[article]
2014
arXiv
pre-print
In this paper, we argue that diversity, not direct diversity on samples but adaptive diversity with data, is highly correlated to ensemble accuracy, and we propose a novel technology for classifier ensemble ...
Classifier ensemble generally should combine diverse component classifiers. However, it is difficult to give a definitive connection between diversity measure and ensemble accuracy. ...
Consequently, we expand the ensemble diversity via weighted kernels, and propose a novel ensemble method, learning to diversify via weighted kernels. ...
arXiv:1406.1167v1
fatcat:tvaqkcoefbc2dajcwfs5rxobt4
An Incremental Learning Method Based on Dynamic Ensemble RVM for Intrusion Detection
2021
IEEE Transactions on Network and Service Management
Therefore, we propose an intrusion detection method of dynamic ensemble incremental learning (DEIL-RVM), and realize a dynamically adjusted ensemble intrusion detection model. ...
The RVM, with its high sparsity, is used as the base component to obtain a good balance between accuracy, robustness, and resource consumption, sacrificing less time and storage cost in ensemble ...
ACKNOWLEDGMENT This work was supported in part by the joint funds of National Natural Science Foundation of China and Civil Aviation Administration of China (U1933108), and the Fundamental Research Funds ...
doi:10.1109/tnsm.2021.3102388
fatcat:t42lk6cyvvbjverj72kietduqy
Showing results 1 — 15 out of 2,906 results