44,622 Hits in 3.5 sec

Toward global optimization of neural networks: A comparison of the genetic algorithm and backpropagation

Randall S. Sexton, Robert E. Dorsey, John D. Johnson
1998 Decision Support Systems  
The value of using the genetic algorithm over backpropagation for neural network optimization is illustrated through a Monte Carlo study which compares each algorithm on in-sample, interpolation, and extrapolation  ...  The vast majority of these studies rely on a gradient algorithm, typically a variation of backpropagation, to obtain the parameters (weights) of the model.  ...  Since the GA was implemented on a CRAY-YMP, precision and operation time could affect the comparison with the PC-based NeuralWorks, so a backpropagation algorithm written and optimized for a CRAY-YMP  ... 
doi:10.1016/s0167-9236(97)00040-7 fatcat:4snnk3uxfvar7ddlhjv5xevh6e
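As a rough illustration of the idea compared in the entry above (evolving network weights with a genetic algorithm instead of gradient descent), here is a minimal, self-contained sketch. The network size, GA parameters, and toy data are illustrative assumptions, not the study's setup:

```python
# Sketch: evolving the 7 weights of a tiny 1-2-1 tanh network with a
# simple genetic algorithm (elitism + uniform crossover + Gaussian mutation).
import math
import random

random.seed(0)

DATA = [(x / 10.0, math.sin(x / 10.0)) for x in range(-30, 31)]  # toy regression set
N_WEIGHTS = 7  # 1-2-1 net: two hidden units (w, b each) plus output (2 w + 1 b)

def forward(w, x):
    h1 = math.tanh(w[0] * x + w[1])
    h2 = math.tanh(w[2] * x + w[3])
    return w[4] * h1 + w[5] * h2 + w[6]

def mse(w):
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def evolve(pop_size=40, generations=200, mut_rate=0.3, mut_scale=0.2):
    pop = [[random.uniform(-1, 1) for _ in range(N_WEIGHTS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=mse)
        elite = pop[: pop_size // 4]                 # keep the best quarter
        children = list(elite)
        while len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            child = [g + random.gauss(0, mut_scale) if random.random() < mut_rate else g
                     for g in child]                             # Gaussian mutation
            children.append(child)
        pop = children
    return min(pop, key=mse)

best = evolve()
print("final MSE:", mse(best))
```

Unlike backpropagation, this loop never computes a gradient, which is why such studies pitch GAs as global (if slower) optimizers.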

Comparison Of Particle Swarm Optimization And Backpropagation Algorithms For Training Feedforward Neural Network

Nasser Mohammadi, Seyed Javad Mirabedini
2014 Journal of Mathematics and Computer Science  
Generally, the Backpropagation (BP) algorithm is used to train the neural network.  ...  In this paper, to improve the performance of ANN, the adjustment of network weights using Particle Swarm Optimization (PSO) was proposed as a mechanism and the results obtained were compared with various  ...  In comparison with other metaheuristics, PSO has gained popularity and has clearly shown itself to be an effective and competitive optimization algorithm.  ... 
doi:10.22436/jmcs.012.02.03 fatcat:lbziih44trhmhhvaxwdue3q5ym

Comparison of particle swarm optimization and backpropagation as training algorithms for neural networks

V.G. Gudise, G.K. Venayagamoorthy
Proceedings of the 2003 IEEE Swarm Intelligence Symposium. SIS'03 (Cat. No.03EX706)  
Abstract - Particle swarm optimization (PSO), motivated by the social behavior of organisms, is a step up from existing evolutionary algorithms for optimization of continuous  ...  Backpropagation (BP) is generally used for neural network training. Choosing a proper algorithm for training a neural network is very important.  ...  CONCLUSIONS A feedforward neural network learning a nonlinear function with the backpropagation and particle swarm optimization algorithms has been presented in this paper.  ... 
doi:10.1109/sis.2003.1202255 dblp:conf/swis/GudiseV03 fatcat:djtbyxebizdhhfpgvaawtjitni
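The PSO-as-trainer idea from the entry above can be sketched as follows. This is a toy illustration (XOR task, standard gbest PSO with illustrative parameters), not the authors' experimental setup:

```python
# Sketch: training a 2-2-1 sigmoid network on XOR by letting a particle
# swarm search the 9-dimensional weight space directly (no gradients).
import math
import random

random.seed(1)

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
DIM = 9  # 2-2-1 net: (2 w + 1 b) per hidden unit, plus 2 w + 1 b for the output

def sigmoid(z):
    if z < -60: return 0.0   # guard against overflow for runaway particles
    if z > 60: return 1.0
    return 1.0 / (1.0 + math.exp(-z))

def forward(w, x):
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def loss(w):
    return sum((forward(w, x) - y) ** 2 for x, y in XOR) / len(XOR)

def pso(n=30, iters=300, inertia=0.72, c1=1.49, c2=1.49):
    pos = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(n)]
    vel = [[0.0] * DIM for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [loss(p) for p in pos]
    g = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(DIM):
                vel[i][d] = (inertia * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = loss(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

weights, final_loss = pso()
print("XOR training loss:", final_loss)
```

Each particle is a complete weight vector, so the fitness function is simply the network's training error.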

Optimization of the Backpropagation Method with Nguyen-Widrow in Face Image Classification

Ichsanuddin Hakim, Syahril Efendi, Pahala Sirait
2021 Randwick International of Social Science Journal  
In this study, it is proven that the Nguyen-Widrow algorithm can optimize the Backpropagation method in terms of initializing weights and biases.  ...  In the testing process with a hidden layer of 6 neurons, at a target error of 0.01, the standard Backpropagation method obtained an accuracy of 96%, while the optimized Backpropagation method obtained a  ...  In Figure 4 a comparison can be seen between the accuracy of the test results for the standard Backpropagation method and Backpropagation with Nguyen-Widrow optimization.  ... 
doi:10.47175/rissj.v2i2.226 fatcat:bujqlnd54jhz5jwlwmokfwp2b4
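For reference, the Nguyen-Widrow initialization discussed in the entry above follows a simple rule: draw small random weights, then rescale each hidden neuron's weight vector to norm beta = 0.7 * h^(1/n) for h hidden neurons and n inputs, with biases drawn from [-beta, beta]. A minimal sketch (the layer sizes below are illustrative, not the paper's network):

```python
# Sketch of Nguyen-Widrow initialization for a single hidden layer.
import math
import random

random.seed(42)

def nguyen_widrow_init(n_inputs, n_hidden):
    beta = 0.7 * n_hidden ** (1.0 / n_inputs)   # scale factor 0.7 * h^(1/n)
    weights, biases = [], []
    for _ in range(n_hidden):
        w = [random.uniform(-0.5, 0.5) for _ in range(n_inputs)]
        norm = math.sqrt(sum(v * v for v in w)) or 1.0
        weights.append([beta * v / norm for v in w])   # rescale to norm beta
        biases.append(random.uniform(-beta, beta))
    return weights, biases

W, b = nguyen_widrow_init(n_inputs=4, n_hidden=6)
```

The rescaling spreads the neurons' active (non-saturated) regions across the input range, which is why it tends to speed up subsequent backpropagation training.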

Rainfall prediction using backpropagation algorithm optimized by Broyden-Fletcher-Goldfarb-Shanno algorithm

S Anam
2019 IOP Conference Series: Materials Science and Engineering  
However, the performance of the backpropagation algorithm depends on the architecture and the optimization method used.  ...  The backpropagation algorithm is one of the ANN techniques which has been successfully used in various fields.  ...  The backpropagation algorithm optimized by the BFGS algorithm results in the best MSE (Mean Square Error). [Figure caption: Comparison of the Target and Network Output]  ... 
doi:10.1088/1757-899x/567/1/012008 fatcat:ck6c7upvkjfrhn75itkg3ryif4
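Training a network "optimized by BFGS" means replacing the plain gradient step with a quasi-Newton update that maintains an inverse-Hessian approximation. A compact, hypothetical sketch of the BFGS update with Armijo backtracking, demonstrated on the 2-D Rosenbrock function as a stand-in for a network loss (not the paper's rainfall model):

```python
# Sketch: BFGS with an inverse-Hessian approximation H and a simple
# backtracking (Armijo) line search, minimizing the Rosenbrock function.
import math

def f(p):
    x, y = p
    return (1 - x) ** 2 + 100 * (y - x * x) ** 2

def grad(p):
    x, y = p
    return [-2 * (1 - x) - 400 * x * (y - x * x), 200 * (y - x * x)]

def bfgs(p, iters=200):
    H = [[1.0, 0.0], [0.0, 1.0]]          # start from the identity
    g = grad(p)
    for _ in range(iters):
        d = [-(H[0][0] * g[0] + H[0][1] * g[1]),
             -(H[1][0] * g[0] + H[1][1] * g[1])]
        t, fp, slope = 1.0, f(p), g[0] * d[0] + g[1] * d[1]
        while f([p[0] + t * d[0], p[1] + t * d[1]]) > fp + 1e-4 * t * slope:
            t *= 0.5                       # backtrack until Armijo holds
            if t < 1e-12:
                break
        p_new = [p[0] + t * d[0], p[1] + t * d[1]]
        g_new = grad(p_new)
        s = [p_new[0] - p[0], p_new[1] - p[1]]
        yv = [g_new[0] - g[0], g_new[1] - g[1]]
        sy = s[0] * yv[0] + s[1] * yv[1]
        if sy > 1e-12:                     # curvature OK: apply BFGS update
            rho = 1.0 / sy
            # H <- (I - rho s y^T) H (I - rho y s^T) + rho s s^T
            A = [[1 - rho * s[0] * yv[0], -rho * s[0] * yv[1]],
                 [-rho * s[1] * yv[0], 1 - rho * s[1] * yv[1]]]
            T = [[sum(A[i][k] * H[k][j] for k in range(2)) for j in range(2)]
                 for i in range(2)]
            H = [[sum(T[i][k] * A[j][k] for k in range(2)) + rho * s[i] * s[j]
                  for j in range(2)] for i in range(2)]
        p, g = p_new, g_new
        if math.hypot(g[0], g[1]) < 1e-8:  # gradient small enough: done
            break
    return p

p = bfgs([-1.2, 1.0])
```

In the ANN setting, `p` would be the flattened weight vector and `f`/`grad` the training error and its backpropagated gradient.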

Neural Network Training by Parameter Optimization Approach [chapter]

Ö. Ciftcioglu, E. Türkcan
1993 ICANN '93  
The comparison of the results with those obtained by the standard backpropagation algorithm indicated that the approach to the global minimum is much improved by the optimization algorithms.  ...  Signal positions are indicated in the schematic representation of the power plant (Fig. 1). The optimization algorithms used are quasi-Newton, conjugate gradient, and standard backpropagation.  ... 
doi:10.1007/978-1-4471-2063-6_254 fatcat:e4bb63u2s5dytkjgml26nij5uy

Training Deep Neural Networks with Constrained Learning Parameters [article]

Prasanna Date, Christopher D. Carothers, John E. Mitchell, James A. Hendler, Malik Magdon-Ismail
2020 arXiv   pre-print
Backpropagation.  ...  We use the following performance metrics for the comparison: (i) training error; (ii) validation error; (iii) memory usage; and (iv) training time.  ...  Fig. 7: Comparison of Backpropagation and CoNNTrA. Algorithm 1: Discretization subroutine for CoNNTrA. Function Discretize(W_pre, ω); Input: W_pre: pretrained weights, ω: set of finite discrete values  ... 
arXiv:2009.00540v1 fatcat:cx6gpxuqazfejhviud4sfxap3a

Momentum Backpropagation Optimization for Cancer Detection Based on DNA Microarray Data

Untari Novia Wisesty, Febryanti Sthevanie, Rita Rismala
2021 International Journal of Artificial Intelligence Research  
Therefore, in this research the Momentum Backpropagation algorithm is optimized by adding an adaptive learning rate scheme.  ...  The proposed scheme is proven to reduce the number of epochs needed in the training process from 390 to 76 compared to the Momentum Backpropagation algorithm.  ...  Comparison of the number of epochs for the optimization algorithms in cancer detection systems.  ... 
doi:10.29099/ijair.v4i2.188 fatcat:simadhkdurcphnqmexzohvivye
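A momentum-plus-adaptive-learning-rate scheme like the one described above is commonly implemented by growing the learning rate while the error falls and shrinking it (and resetting the momentum term) when the error rises too much. A toy sketch on a two-parameter linear model; all constants and the rejection rule are illustrative assumptions, not the paper's configuration:

```python
# Sketch: gradient descent with momentum and an adaptive learning rate.
# Accepted steps grow lr; steps that raise the error too much are
# rejected, lr is shrunk, and the momentum term is reset.
import random

random.seed(7)

DATA = [(i / 5.0, 3.0 * (i / 5.0) - 2.0) for i in range(-10, 11)]  # line y = 3x - 2

def mse_and_grad(w, b):
    n = len(DATA)
    err = gw = gb = 0.0
    for x, y in DATA:
        d = (w * x + b) - y
        err += d * d
        gw += 2 * d * x
        gb += 2 * d
    return err / n, gw / n, gb / n

def train(epochs=300, lr=0.05, momentum=0.9, inc=1.05, dec=0.7, max_rise=1.04):
    w, b = random.uniform(-1, 1), random.uniform(-1, 1)
    vw = vb = 0.0
    err, gw, gb = mse_and_grad(w, b)
    for _ in range(epochs):
        vw = momentum * vw - lr * gw        # momentum-smoothed step
        vb = momentum * vb - lr * gb
        cand_w, cand_b = w + vw, b + vb
        new_err, ngw, ngb = mse_and_grad(cand_w, cand_b)
        if new_err > max_rise * err:        # error rose too much:
            lr *= dec                       #   shrink the step, drop momentum
            vw = vb = 0.0
        else:
            w, b, err, gw, gb = cand_w, cand_b, new_err, ngw, ngb
            lr *= inc                       # error fell: grow the learning rate
    return w, b, err

w, b, final_err = train()
print("recovered w, b:", w, b)
```

The adaptive rule is what cuts the epoch count: the learning rate ratchets up whenever progress is steady, instead of staying at a conservative fixed value.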


2013 Journal of Computer Science  
The accuracy in recognizing characters differs by 10.77%, with a success rate of 90.77% for the optimized backpropagation network and 80% accuracy for the standard backpropagation network.  ...  In this study, the backpropagation network algorithm is combined with a genetic algorithm to achieve both accuracy and training swiftness in recognizing alphabets.  ...  With a shorter training time, the genetic-algorithm-optimized neural networks achieved more accurate recognition than standard backpropagation.  ... 
doi:10.3844/jcssp.2013.1435.1442 fatcat:do4raryerfcjrgx3vhvw2lxway

Predicting the Number of COVID-19 Sufferers in Malang City Using the Backpropagation Neural Network with the Fletcher–Reeves Method

Syaiful Anam, Mochamad Hakim Akbar Assidiq Maulana, Noor Hidayat, Indah Yanti, Zuraidah Fitriah, Dwi Mifta Mahanani, Wan Hanna Melini
2021 Applied Computational Intelligence and Soft Computing  
Backpropagation, a type of ANN algorithm, offers predictive problem solving with good performance. However, its performance depends on the optimization method applied during the training process.  ...  Based on this hypothesis, this paper proposes a prediction model for the number of COVID-19 sufferers in Malang using the Backpropagation neural network with the Fletcher–Reeves method.  ...  Comparison between the number of COVID-19 sufferers from the actual data and the number of COVID-19 sufferers predicted by the Backpropagation algorithm for testing data.  ... 
doi:10.1155/2021/6658552 fatcat:bcc6qmoijvcv5jb56we3d4qxoa
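The Fletcher-Reeves method referenced in the entry above is a conjugate gradient scheme: each search direction is the negative gradient plus beta times the previous direction, with beta = ||g_new||^2 / ||g_old||^2. A minimal sketch on a 2-D quadratic stand-in for the network loss (not the paper's rainfall/COVID-19 models), where the line search can be done exactly:

```python
# Sketch: Fletcher-Reeves conjugate gradient minimizing the quadratic
# f(p) = 0.5 p^T A p - B^T p, whose gradient is A p - B.
A = [[2.0, 1.0], [1.0, 4.0]]   # toy Hessian (stand-in for the loss curvature)
B = [3.0, 6.0]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(2)) for i in range(2)]

def dot(u, v):
    return sum(a * c for a, c in zip(u, v))

def fletcher_reeves(p, iters=10):
    g = [gi - bi for gi, bi in zip(matvec(A, p), B)]   # gradient A p - B
    d = [-gi for gi in g]                              # first direction: steepest descent
    for _ in range(iters):
        Ad = matvec(A, d)
        alpha = -dot(g, d) / dot(d, Ad)                # exact line search on a quadratic
        p = [pi + alpha * di for pi, di in zip(p, d)]
        g_new = [gi - bi for gi, bi in zip(matvec(A, p), B)]
        if dot(g_new, g_new) < 1e-12:                  # converged
            return p
        beta = dot(g_new, g_new) / dot(g, g)           # Fletcher-Reeves coefficient
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return p

p = fletcher_reeves([0.0, 0.0])
```

In the Backpropagation setting, the exact line search is replaced by a numerical one and the gradient comes from backpropagating the training error; the direction update is unchanged.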

NN-AirPol: a neural-networks-based method for air pollution evaluation and control

Ferhat Karaca, Alexander Nikov, Omar Alagha
2006 International Journal of Environment and Pollution  
As the best among 11 backpropagation algorithms, the Levenberg-Marquardt algorithm was selected. The optimal architecture of the NN-AirPol neural network was determined.  ...  The most popular neural networks, backpropagation algorithms, were used to model the relationships between local meteorological data and concentrations of air pollution indicators such as sulphur dioxide  ... 
doi:10.1504/ijep.2006.011214 fatcat:6wp4yw5dtfazdn2g3bgewzmznq

Optimization of Backpropagation for Early Detection of Diabetes Mellitus

Rosita Sofiana, Sutikno Sutikno
2018 International Journal of Electrical and Computer Engineering (IJECE)  
The optimized backpropagation algorithm may allow the training process to go 12.4 times faster than standard backpropagation.  ...  This method is backpropagation with three optimizations, namely initial weight determination with the Nguyen-Widrow algorithm, adaptive learning rate determination, and determination of weight change by applying momentum  ...  of the optimized backpropagation algorithm to identify diabetes mellitus.  ... 
doi:10.11591/ijece.v8i5.pp3232-3237 fatcat:ei2soj4355cxhe3cltrrph6kiq


Rajeshkumar J, Kousalya K
2017 International Research Journal of Pharmacy  
In this paper, the whale optimization algorithm and a backpropagation neural network are integrated to diagnose diabetes mellitus.  ...  In the proposed methodology, the whale optimization technique develops new solutions in the solution space and the backpropagation algorithm finds the globally optimal solution.  ...  The following algorithm illustrates the proposed whale optimization algorithm using a backpropagation neural network for solving the diabetic data classification problem.  ... 
doi:10.7897/2230-8407.0811242 fatcat:3vmfy6qu5zby5hrdoihxcb44je

Comparison of NEAT and Backpropagation Neural Network on Breast Cancer Diagnosis

Hamza Turabieh
2016 International Journal of Computer Applications  
In this paper we present a comparison between the NeuroEvolution of Augmenting Topologies (NEAT) algorithm and a Backpropagation Neural Network for the prediction of breast cancer.  ...  Machine learning algorithms could be used to enhance the performance of medical practitioners in the diagnosis of breast cancer.  ...  Table 5 presents a comparison between the NEAT algorithm and other algorithms in the literature. It is clear that the NEAT algorithm's results outperform the other algorithms.  ... 
doi:10.5120/ijca2016909245 fatcat:pufhpgvbgndw3m5ujdtiinjdli

Application of chaotic Fish School Search optimization algorithm with exponential step decay in neural network loss function optimization

L.A. Demidova, A.V. Gorchakov
2021 Procedia Computer Science  
Since the first mention of FSS, this effective optimization algorithm has been of great interest among researchers and practitioners around the globe.  ...  Evolutionary optimization algorithms do not impose any additional limitations on the optimized function.  ... 
doi:10.1016/j.procs.2021.04.156 fatcat:i2gyosgtm5hzvcrsqcivl4hdwa
Showing results 1 — 15 out of 44,622 results