45,109 Hits in 3.1 sec

Diversity and Generalization in Neural Network Ensembles [article]

Luis A. Ortega, Rafael Cabañas, Andrés R. Masegosa
2022 arXiv   pre-print
But the exact role that diversity plays in ensemble models is poorly understood, especially in the context of neural networks.  ...  More precisely, we provide sound answers to the following questions: how to measure diversity, how diversity relates to the generalization error of an ensemble, and how diversity is promoted by neural  ...  Finally, we would like to thank the "María Zambrano" grant (RR C 2021 01) from the Spanish Ministry of Universities, funded with NextGenerationEU funds.  ... 
arXiv:2110.13786v2 fatcat:7ww5hudhjzcypc3z3nebhostgq
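
The excerpt above asks how ensemble diversity can be measured and how it relates to generalization error. As a hedged illustration only (these are common textbook measures, not necessarily the definitions used in the paper), the sketch below computes pairwise disagreement for classifier ensembles and the ambiguity term of the classic error decomposition for regression ensembles.

import numpy as np

# Illustrative sketch only: two common ways to quantify ensemble diversity
# from member predictions (not necessarily the paper's definitions).

def pairwise_disagreement(votes):
    """votes: (n_members, n_samples) array of predicted class labels.
    Returns the average fraction of samples on which two members disagree."""
    m = votes.shape[0]
    total, pairs = 0.0, 0
    for i in range(m):
        for j in range(i + 1, m):
            total += np.mean(votes[i] != votes[j])
            pairs += 1
    return total / pairs

def ambiguity(preds):
    """preds: (n_members, n_samples) array of real-valued outputs.
    Ambiguity term of the decomposition:
    ensemble error = average member error - average ambiguity."""
    return np.mean((preds - preds.mean(axis=0)) ** 2)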

Evolutionary random neural ensembles based on negative correlation learning

Huanhuan Chen, Xin Yao
2007 2007 IEEE Congress on Evolutionary Computation  
The idea of generating an ensemble by simultaneous randomization of data and features is to promote diversity within the ensemble and encourage the individual NNs in the ensemble to learn different  ...  The algorithm utilizes both bootstrapping of the training data and random feature subspace techniques to generate an initial, diverse ensemble and evolves the ensemble with negative correlation learning.  ...  In general, the individual classifiers in an ensemble are designed to be accurate and diverse.  ... 
doi:10.1109/cec.2007.4424645 dblp:conf/cec/ChenY07 fatcat:fqexqjdt7vcujgrlf65vq7hsse
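
The entry above generates its initial ensemble by combining bootstrap samples of the training data with random feature subspaces, before evolving the ensemble with negative correlation learning. The sketch below shows only that randomized generation step, using scikit-learn's MLPClassifier as a stand-in for the paper's neural networks; the evolutionary NCL stage is omitted.

import numpy as np
from sklearn.neural_network import MLPClassifier

def build_random_ensemble(X, y, n_members=10, subspace_frac=0.5, seed=0):
    """Each member sees a bootstrap sample of the rows and a random subset of
    the feature columns, so members differ in both data and features."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    members = []
    for _ in range(n_members):
        rows = rng.integers(0, n, size=n)                       # bootstrap sample
        cols = rng.choice(d, size=max(1, int(subspace_frac * d)), replace=False)
        net = MLPClassifier(hidden_layer_sizes=(20,), max_iter=500)
        net.fit(X[rows][:, cols], y[rows])
        members.append((net, cols))                             # remember the subspace
    return members

def ensemble_predict(members, X):
    # Simple majority vote over member predictions (integer class labels assumed).
    votes = np.array([net.predict(X[:, cols]) for net, cols in members])
    return np.apply_along_axis(lambda v: np.bincount(v.astype(int)).argmax(), 0, votes)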

A Novel Nonlinear Neural Network Ensemble Model for Financial Time Series Forecasting [chapter]

Kin Keung Lai, Lean Yu, Shouyang Wang, Huang Wei
2006 Lecture Notes in Computer Science  
In this study, a new nonlinear neural network ensemble model is proposed for financial time series forecasting. In this model, many different neural network models are first generated.  ...  Then the principal component analysis technique is used to select the appropriate ensemble members. Finally, the support vector machine regression method is used to combine the neural network ensemble.  ...  (1) among all the ensemble methods, the SVMR-based ensemble model performs the best, followed by the neural network based ensemble method and the other three linear ensemble methods, from a general view, and (  ... 
doi:10.1007/11758501_106 fatcat:ewt6oiivkfel3kvbuohk4nlmiq
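
The entry above outlines a three-stage pipeline: generate many neural network forecasters, select ensemble members with principal component analysis, and combine the selected members with support vector machine regression. The excerpt does not give the exact PCA selection criterion, so the sketch below assumes a simple variant (retain enough members to cover most of the variance in the forecast matrix, picking those that load most on the first component) and fits an SVR meta-model on the retained members' outputs.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.svm import SVR

def select_and_combine(member_preds, y, var_threshold=0.95):
    """member_preds: (n_samples, n_members) forecasts on a validation set.
    The selection rule here is an assumption for illustration, not the
    paper's exact criterion."""
    pca = PCA().fit(member_preds)
    cum_var = np.cumsum(pca.explained_variance_ratio_)
    k = int(np.searchsorted(cum_var, var_threshold) + 1)   # effective number of members
    keep = np.argsort(-np.abs(pca.components_[0]))[:k]     # members loading most on PC1
    combiner = SVR(kernel="rbf").fit(member_preds[:, keep], y)
    return keep, combiner

# usage: keep, combiner = select_and_combine(val_preds, val_y)
#        final_forecast = combiner.predict(test_preds[:, keep])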

Relationship between Diversity and Performance of Multiple Classifiers for Decision Support [article]

R. Musehane, F. Netshiongolwe, F.V. Nelwamondo, L. Masisi, T. Marwala
2008 arXiv   pre-print
The parameters of the neural networks within the committee were varied to induce diversity; hence structural diversity is the focus of this study.  ...  It is observed that the classification accuracy of an ensemble increases as the diversity increases. There was an increase of 3%-6% in the classification accuracy.  ...  The diversity index is defined from the total number of neural networks in an ensemble and the total number of different neural networks (species). The diversity ranges from 0 to 1, where 0 indicates low diversity and 1 indicates highest diversity  ... 
arXiv:0810.3865v1 fatcat:aatpiom3s5d63lu4otoycsrysi
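
The excerpt defines structural diversity from the number of distinct network structures ("species") relative to the ensemble size. One simple way to realize such an index, assuming it is the ratio of distinct structures to ensemble size (the paper's exact formula is not shown in the excerpt):

def structural_diversity(architectures):
    """architectures: list of hashable architecture descriptions, e.g.
    [(10, 'tanh'), (10, 'tanh'), (25, 'relu')].  Returns the fraction of
    distinct structures; values near 1 mean high structural diversity."""
    return len(set(architectures)) / len(architectures)

# Example: 4 networks but only 2 distinct structures -> 0.5
print(structural_diversity([(10, 'tanh'), (10, 'tanh'), (25, 'relu'), (25, 'relu')]))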

Efficient Diversity-Driven Ensemble for Deep Neural Networks [article]

Wentao Zhang, Jiawei Jiang, Yingxia Shao, Bin Cui
2021 arXiv   pre-print
The ensemble of deep neural networks has been shown, both theoretically and empirically, to improve generalization accuracy on the unseen test set.  ...  Compared with other well-known ensemble methods, EDDE achieves the highest ensemble accuracy with the lowest training cost, which means it is efficient for ensembling neural networks.  ...  CONCLUSION Ensemble learning is useful in improving the generalization ability of deep neural networks.  ... 
arXiv:2112.13316v1 fatcat:lwwlabm6q5erbggtgxhx2smvpi

Neural Network Ensembles: Theory, Training, and the Importance of Explicit Diversity [article]

Wenjing Li, Randy C. Paffenroth, David Berthiaume
2021 arXiv   pre-print
the subtle trade-off between accuracy and diversity in an ensemble.  ...  We also propose a training algorithm for neural network ensembles and demonstrate that our approach provides improved performance when compared to both state-of-the-art individual learners and ensembles  ...  Our neural network ensemble (in orange) shows an overall higher diversity and higher accuracy than DECORATE neural network ensemble (in blue).  ... 
arXiv:2109.14117v1 fatcat:ta3msv3r5rf5vc22rkeipkdgb4

Active Diverse Learning Neural Network Ensemble Approach for Power Transformer Fault Diagnosis

Yu Xu, Dongbo Zhang, Yaonan Wang
2010 Journal of Networks  
An ensemble learning algorithm was proposed in this paper by analyzing the error function of neural network ensembles, by which individual neural networks were actively guided to learn diversity.  ...  And all the individual networks in the ensemble were led to learn diversity through cooperative training.  ...  CONCLUSIONS The key factor in improving generalization ability lies in the diversity of the individual networks in the ensemble.  ... 
doi:10.4304/jnw.5.10.1151-1159 fatcat:vlqcsuqyrvfdbbfqubpbi2b2tq

Credit risk assessment with a multistage neural network ensemble learning approach

L. Yu, S. Wang, K. Lai
2008 Expert systems with applications  
In the third stage, the generated neural network models are trained with different training datasets, and accordingly the classification score and reliability value of each neural classifier can be obtained.  ...  In the final stage, the selected neural network ensemble members are fused to obtain the final classification result by means of a reliability measurement.  ...  Acknowledgements The authors would like to thank the Editor-in-Chief and reviewers for their recommendations and comments. This  ... 
doi:10.1016/j.eswa.2007.01.009 fatcat:dimmdsso6rejxcjmpp7oliwwoe
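
The entry above fuses the selected neural classifiers by means of a reliability measurement. The precise reliability definition and fusion rule are not in the excerpt; the sketch below assumes a simple reliability-weighted average of the members' classification scores.

import numpy as np

def reliability_fusion(scores, reliabilities, threshold=0.5):
    """scores: (n_members, n_samples) class-1 probabilities from each classifier.
    reliabilities: (n_members,) reliability values, e.g. validation accuracy.
    The exact fusion rule is an assumption for illustration."""
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                              # normalize weights
    fused = w @ scores                           # weighted average score per sample
    return (fused >= threshold).astype(int)      # final accept/reject decision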

Bagging of Complementary Neural Networks with Double Dynamic Weight Averaging

Sathit Nakkrasae, Pawalai Kraipeerapun, Somkid Amornsamankul, Chun Che Fung
2010 2010 11th ACIS International Conference on Software Engineering, Artificial Intelligence, Networking and Parallel/Distributed Computing  
In order to enhance the diversity in the ensemble, different training datasets created based on the bagging technique are applied to an ensemble of pairs of feed-forward back-propagation neural networks created  ...  Ensemble techniques have been widely applied in regression problems. This paper proposes a novel approach to the ensemble of Complementary Neural Networks (CMTNN) using double dynamic weight averaging.  ...  The same generated training data are used for both the truth and falsity neural networks in each component.  ... 
doi:10.1109/snpd.2010.34 dblp:conf/snpd/NakkrasaeKAF10 fatcat:fcm7qsfgtfg7xa2wtadvd7ziha
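
The entry above bags pairs of Complementary Neural Networks, where each pair consists of a truth network trained on the targets and a falsity network trained on their complement, both on the same bootstrap sample. The sketch below shows one such bagged component with scikit-learn regressors as stand-ins, assuming targets scaled to [0, 1]; the paper's double dynamic weight averaging is replaced here by a plain average for brevity.

import numpy as np
from sklearn.neural_network import MLPRegressor

def train_component(X, y, rng):
    """One bagged CMTNN component: a truth net on y and a falsity net on 1 - y,
    both fitted on the same bootstrap sample (y assumed scaled to [0, 1])."""
    rows = rng.integers(0, len(X), size=len(X))
    truth = MLPRegressor(hidden_layer_sizes=(20,), max_iter=1000).fit(X[rows], y[rows])
    falsity = MLPRegressor(hidden_layer_sizes=(20,), max_iter=1000).fit(X[rows], 1.0 - y[rows])
    return truth, falsity

def predict_component(truth, falsity, X):
    # Combine the truth output with the complemented falsity output
    # (a plain average, not the paper's double dynamic weight averaging).
    return (truth.predict(X) + (1.0 - falsity.predict(X))) / 2.0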

Evolutionary Ensemble for In Silico Prediction of Ames Test Mutagenicity [chapter]

Huanhuan Chen, Xin Yao
2007 Lecture Notes in Computer Science  
Evolutionary random neural ensemble with negative correlation learning (ERNE) [1] was developed based on neural networks and evolutionary algorithms.  ...  Therefore, the resulting individuals in the final ensemble are capable of cooperating collectively to achieve better generalization in prediction.  ...  Since ERNE employs bootstrap sampling and the random subspace method to generate the initial neural ensemble and maintains the randomization in the evolving stage, diversity in the ensemble is encouraged  ... 
doi:10.1007/978-3-540-74205-0_120 fatcat:xkj52l7wgzhujcyg6s4f3nrqqa

DIVACE: Diverse and Accurate Ensemble Learning Algorithm [chapter]

Arjun Chandra, Xin Yao
2004 Lecture Notes in Computer Science  
In order for a neural network ensemble to generalise properly, two factors are considered vital. One is the diversity and the other is the accuracy of the networks that comprise the ensemble.  ...  There exists a tradeoff as to what should be the optimal measures of diversity and accuracy. The aim of this paper is to address this issue.  ...  [5] gives a good account of why diversity is necessary in neural network ensembles and presents a taxonomy of methods that enforce it in practice.  ... 
doi:10.1007/978-3-540-28651-6_91 fatcat:pgyjzkg3vvd23n5eo3zyqteo3u

Designing classifier ensembles with constrained performance requirements

Weizhong Yan, Kai F. Goebel, Belur V. Dasarathy
2004 Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2004  
Specifically, we present a strategy for designing neural network ensembles to satisfy constrained performance requirements, which is illustrated with a real-world classification problem.  ...  Classifier ensembles are one of the most significant advances in pattern recognition/classification in recent years and have been actively studied by many researchers.  ...  Section 2 provides an overview of classifier ensembles in general and neural network ensembles in particular. Section 3 formulates the constrained performance requirements.  ... 
doi:10.1117/12.542616 fatcat:e2bqpjbxjrcxvpidr4vny672nq

Augmenting Novelty Search with a Surrogate Model to Engineer Meta-Diversity in Ensembles of Classifiers [article]

Rui P. Cardoso, Emma Hart, David Burth Kurka, Jeremy V. Pitt
2022 arXiv   pre-print
in Novelty Search.  ...  Using Neuroevolution combined with Novelty Search to promote behavioural diversity is capable of constructing high-performing ensembles for classification.  ...  neural network in each generation.  ... 
arXiv:2201.12896v3 fatcat:cprkgjw5bvappogdp52umxzm5i
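
The entry above drives behavioural diversity with Novelty Search (plus a surrogate model to reduce evaluation cost). The sketch below shows only the core novelty score, i.e. the mean distance from a candidate's behaviour descriptor to its k nearest neighbours in the archive of past behaviours; the surrogate-model component is not shown.

import numpy as np

def novelty_score(behaviour, archive, k=15):
    """behaviour: 1-D descriptor, e.g. the network's predictions on a probe set.
    archive: (n_archived, d) array of previously seen descriptors.
    A candidate is 'novel' if it is far from its k nearest archived behaviours."""
    dists = np.linalg.norm(archive - behaviour, axis=1)
    k = min(k, len(dists))
    return np.sort(dists)[:k].mean()     # mean distance to k nearest behaviours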

Negative correlation in incremental learning

Fernanda Li Minku, Hirotaka Inoue, Xin Yao
2007 Natural Computing  
It encourages the neural networks that compose the ensemble to be different from each other and, at the same time, accurate.  ...  The difference among the neural networks that compose an ensemble is a desirable feature for performing incremental learning, since some of the neural networks may be able to adapt faster and better to new data  ...  Acknowledgements The first author would like to thank the United Kingdom Government and the School of Computer Science of the University of Birmingham for the financial support in the form of an Overseas  ... 
doi:10.1007/s11047-007-9063-7 fatcat:rgooeh6b75dctbip44alt6g654
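
The entry above builds on negative correlation learning, whose training signal for member i adds a penalty p_i = (f_i - fbar) * sum_{j != i} (f_j - fbar) to the squared error, so each network is pushed away from the ensemble mean fbar while staying accurate. A minimal sketch of that loss (the incremental-learning machinery of the paper is not shown):

import numpy as np

def ncl_loss(member_outputs, i, y, lam=0.5):
    """Standard NCL loss for member i (regression form).
    member_outputs: (n_members, n_samples); y: (n_samples,) targets.
    Since sum_j (f_j - fbar) = 0, the penalty equals -(f_i - fbar)^2,
    i.e. a reward for deviating from the ensemble mean."""
    m = len(member_outputs)
    fbar = member_outputs.mean(axis=0)                 # ensemble mean output
    f_i = member_outputs[i]
    penalty = (f_i - fbar) * (member_outputs.sum(axis=0) - f_i - (m - 1) * fbar)
    return np.mean((f_i - y) ** 2 + lam * penalty)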

Diversity regularization in deep ensembles [article]

Changjian Shui, Azadeh Sadat Mozafari, Jonathan Marek, Ihsen Hedhli, Christian Gagné
2018 arXiv   pre-print
However, it has been reported that deep neural network models are often too poorly calibrated for complex tasks requiring reliable uncertainty estimates in their predictions.  ...  In this work, we propose a strategy for training deep ensembles with a diversity function regularization, which improves the calibration property while maintaining a similar prediction accuracy.  ...  However, this proposal was analyzed from the perspective of improving the generalization of neural network ensembles, ignoring the calibration properties.  ... 
arXiv:1802.07881v1 fatcat:evbca7ruu5d7daiqqo44dojufa
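
The entry above regularizes deep-ensemble training with a diversity function to improve calibration. The exact diversity function is not given in the excerpt, so the sketch below uses the variance of the members' predicted class probabilities as a stand-in, subtracted from the average member negative log-likelihood.

import numpy as np

def regularized_objective(member_probs, labels, lam=0.1, eps=1e-12):
    """member_probs: (n_members, n_samples, n_classes) softmax outputs.
    labels: (n_samples,) integer class labels.
    The variance-based diversity term is an illustrative stand-in, not the
    paper's specific diversity function."""
    n_samples = labels.shape[0]
    picked = member_probs[:, np.arange(n_samples), labels]       # prob of true class, per member
    nll = -np.log(picked + eps).mean()                           # average member NLL
    diversity = member_probs.var(axis=0).mean()                  # spread of members' predictions
    return nll - lam * diversity                                 # minimize: fit well, stay diverse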
Showing results 1 — 15 out of 45,109 results