3,566 Hits in 2.4 sec

Training multilayer neural networks using fast global learning algorithm - least-squares and penalized optimization methods

Siu-yeung Cho, Tommy W.S. Chow
1999 Neurocomputing  
This paper presents a novel heuristic approach to a global learning algorithm for neural networks.  ...  Keywords: Multilayer neural networks; Global learning algorithm; Least-squares method; Penalized optimization.  ...  Acknowledgements: The authors would like to thank the anonymous reviewers for their useful comments and suggestions.  ... 
doi:10.1016/s0925-2312(99)00055-7 fatcat:clvxpqdn65dvvmfljj4nbbvg4a

The Research of Ant Colony and Genetic Algorithm in Grid Task Scheduling

Jing Liu, Li Chen, Yuqing Dun, Lingmin Liu, Ganggang Dong
2008 2008 International Conference on MultiMedia and Information Technology  
The proposed algorithm uses the GA to determine the weights of a multilayer feed-forward network with back-propagation learning.  ...  Load Balancing is an important factor in a grid system to improve the global throughput of Grid Resources.  ...  Simulated annealing is a meta-heuristic algorithm for the global optimization problem of locating a good approximation to the global optimum of a given function in a large search space.  ... 
doi:10.1109/mmit.2008.61 fatcat:23ipq4loozcs3lrsu6erqi74qm
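The snippet's closing definition of simulated annealing, a meta-heuristic that approximates a global optimum in a large search space, can be illustrated with a minimal sketch. The objective function, step size, and cooling schedule below are illustrative choices, not taken from the paper:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.95, iters=2000, seed=0):
    """Minimize f, accepting worse moves with a temperature-dependent probability."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)        # random neighbour
        fc = f(cand)
        # Always accept improvements; accept worse moves with prob exp(-delta/t).
        if fc < fx or rng.random() < math.exp(-(fc - fx) / max(t, 1e-12)):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                               # geometric cooling schedule
    return best_x, best_f

# Toy multimodal objective; the accepted-worse moves let the walk escape
# local minima early on, while cooling makes it greedy later.
f = lambda x: x * x + 2.0 * math.sin(5.0 * x) + 2.0
x, fx = simulated_annealing(f, x0=4.0)
```

The acceptance rule exp(-Δ/t) is the standard Metropolis criterion referenced by Kirkpatrick et al. (1983), cited in the neighbouring entry.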

Page 290 of The Journal of the Operational Research Society Vol. 60, Issue 2 [page]

2009 The Journal of the Operational Research Society  
A scaled conjugate gradient algorithm for fast supervised learning. Neural Networks 6: 525-533. Montgomery DC (1997). Design and Analysis of Experiments. John Wiley & Sons Inc.: New York.  ...  A scatter-search-based learning algorithm for neural network training. J Heuristics 2: 129-146. Kirkpatrick S, Gelatt Jr CD and Vecchi MP (1983). Optimization by simulated annealing.  ... 

H∞-learning of layered neural networks

K. Nishiyama, K. Suzuki
2001 IEEE Transactions on Neural Networks  
Although the backpropagation (BP) scheme is widely used as a learning algorithm for multilayered neural networks, the learning speed of the BP algorithm to obtain acceptable errors is unsatisfactory in  ...  The aim of this paper is to propose H∞-learning as a novel learning rule and to derive new globally and locally optimized learning algorithms based on H∞-learning.  ...  H∞-LEARNING ALGORITHMS A fast and robust learning algorithm for training multilayered neural networks is derived by applying the H∞ filter, which is equivalent to the Kalman filter in Krein space [24]  ... 
doi:10.1109/72.963763 pmid:18249956 fatcat:p3ivelt6pjg5fixa7nepdprfla

Neural Network Modeling for Evaluating Sodium Temperature of Intermediate Heat Exchanger of Fast Breeder Reactor

Subhra Rani Patra, R. Jehadeesan, S. Rajeswari, Indranil Banerjee, S. A. V Satya Murty, G. Padmakumar, M. Sai Baba
2012 Advances in Computing  
The back-propagation (BP) algorithm is used for training the network.  ...  Further, a model based on a Radial Basis Function (RBF) neural network is developed and trained, and the results are compared with the standard back-propagation algorithm.  ...  Sri S. C. Chetal, Director, IGCAR, Kalpakkam, for his constant support and guidance for this project.  ... 
doi:10.5923/j.ac.20120202.03 fatcat:asgm25sj7nfehjks25swhsgcqi
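The RBF-versus-BP comparison in the entry above rests on the RBF model's structure: fixed Gaussian centres in the hidden layer, with only the output weights fit, which reduces training to linear least squares. A minimal sketch on a toy target (the centres, width, and data are illustrative, not the reactor data):

```python
import numpy as np

def rbf_design(X, centers, width):
    """Gaussian RBF features: phi[i, j] = exp(-||x_i - c_j||^2 / (2 * width^2))."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * width ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(3 * X[:, 0])                           # toy target function

centers = np.linspace(-1, 1, 10).reshape(-1, 1)   # fixed, evenly spaced centres
Phi = rbf_design(X, centers, width=0.3)
w, *_ = np.linalg.lstsq(Phi, y, rcond=None)       # output weights by least squares

pred = Phi @ w
mse = float(np.mean((pred - y) ** 2))
```

Because the hidden layer is fixed, there is no iterative error back-propagation at all, which is why RBF training is often much faster than BP for comparable accuracy.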

Central Point Crossover for Neuro-genetic Hybrids [chapter]

Soonchul Jung, Byung-Ro Moon
2004 Lecture Notes in Computer Science  
In this paper, we consider each neural network as a point in a multi-dimensional problem space and suggest a crossover that locates the central point of a number of neural networks.  ...  The experimental results of our neurogenetic algorithm overall showed better performance than the traditional multi-start heuristic and the genetic algorithm with a traditional crossover.  ...  It is the most popular algorithm for the supervised training of multilayer feed-forward networks due to its simple implementation and fast computational speed.  ... 
doi:10.1007/978-3-540-24854-5_124 fatcat:bqkblitdfjg7liqcb3e6obyzja
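The central-point idea above, treating each network as a point in weight space and producing an offspring at the centre of several parents, reduces to a component-wise mean of the parents' weight vectors. A minimal sketch (the vector size and parent count are illustrative):

```python
import random

def central_point_crossover(parents):
    """Offspring weight vector = component-wise mean (centre) of the parent vectors."""
    n = len(parents[0])
    return [sum(p[i] for p in parents) / len(parents) for i in range(n)]

rng = random.Random(1)
# Three parent networks encoded as flat weight vectors of length 6.
parents = [[rng.gauss(0, 1) for _ in range(6)] for _ in range(3)]
child = central_point_crossover(parents)
```

By construction each offspring gene lies inside the interval spanned by the parents' genes, so the operator interpolates rather than extrapolates in weight space.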

Page 5290 of Mathematical Reviews Vol. , Issue 92i [page]

1992 Mathematical Reviews  
The local constraints can be expressed as simple heuristic rules for network design.  ...  The activation rule governing neuron behavior is derived by breaking down the global constraints for a problem into local constraints for individual neurons.  ... 

Intelligent Control Algorithms in Power Industry

Vyacheslav V. Potekhin, Dmitry N. Pantyukhov, Dmitrii V. Mikheev
2017 EAI Endorsed Transactions on Energy Web  
The article considers a solution to the problem of creating energy technologies for autonomous decentralized energy supply using intelligent automated control systems.  ...  [Ref. 3] The structure of fuzzy output can be presented in the form of a multilayer neural network.  ...  This neural network, whose structure is similar to a feed-forward neural network, consists of 4 layers; neuron weights for each layer except the output layer are fixed and equal to one.  ... 
doi:10.4108/eai.11-7-2017.152766 fatcat:v7xa3luu5faezor4rb4se5zrmu


Neurodynamic programming: a case study of the traveling salesman problem

Jia Ma, Tao Yang, Zeng-Guang Hou, Min Tan, Derong Liu
2007 Neural computing & applications (Print)  
In essence, both of them try to learn an appropriate evaluation function on the basis of a finite amount of experience.  ...  From this perspective, two methods, temporal difference learning and approximate Sarsa, are presented in detail.  ...  It is inspired by the competitive neural networks of Kohonen. A more detailed survey of neural network algorithms can be found in [30] .  ... 
doi:10.1007/s00521-007-0127-5 fatcat:c7affnpjxnfjfm56o2mhhiluaq
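Temporal difference learning, one of the two methods the abstract presents for learning an evaluation function from experience, updates each value estimate toward the one-step bootstrapped target r + γV(s'). A minimal TD(0) sketch on a toy three-state chain (the chain and constants are illustrative, not the TSP setting of the paper):

```python
def td0(episodes=2000, alpha=0.1, gamma=0.9):
    """TD(0) value estimation on the chain 0 -> 1 -> 2, reward 1 on reaching state 2."""
    V = [0.0, 0.0, 0.0]                      # state 2 is terminal
    for _ in range(episodes):
        s = 0
        while s != 2:
            s_next = s + 1
            r = 1.0 if s_next == 2 else 0.0
            # Bootstrapped target: terminal states contribute no future value.
            target = r + (gamma * V[s_next] if s_next != 2 else 0.0)
            V[s] += alpha * (target - V[s])  # move V(s) toward the TD target
            s = s_next
    return V

V = td0()
```

On this deterministic chain the estimates converge to the true discounted returns, V(1) = 1 and V(0) = γ·1 = 0.9.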

A New Back-Propagation Neural Network Optimized with Cuckoo Search Algorithm [chapter]

Nazri Mohd. Nawi, Abdullah Khan, Mohammad Zubair Rehman
2013 Lecture Notes in Computer Science  
The Back-propagation Neural Network (BPNN) algorithm is one of the most widely used techniques for optimizing feed-forward neural network training.  ...  This paper proposes a new meta-heuristic search algorithm, called cuckoo search (CS), based on cuckoo birds' behavior, to train BP with a fast convergence rate while avoiding the local-minima problem.  ...  Back-Propagation Neural Network (BPNN): The Back-Propagation Neural Network (BPNN) is a supervised learning ANN algorithm proposed by Rumelhart, Hinton and Williams in 1986 [35].  ... 
doi:10.1007/978-3-642-39637-3_33 fatcat:hhskcnnoajgeddj5o3ittgdppe
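Cuckoo search combines Lévy-flight moves around existing solutions with abandonment of a fraction of the worst nests each generation. A minimal sketch minimizing a toy sphere function rather than a BPNN training loss (the population size, step scale, and abandonment fraction pa are illustrative defaults):

```python
import math
import random

def levy_step(rng, beta=1.5):
    """Mantegna's algorithm for a heavy-tailed, Levy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.gauss(0, sigma)
    v = rng.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(f, dim=2, n_nests=15, pa=0.25, iters=300, seed=0):
    rng = random.Random(seed)
    nests = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_nests)]
    fit = [f(n) for n in nests]
    for _ in range(iters):
        # Generate a cuckoo by a Levy flight around a randomly chosen nest.
        i = rng.randrange(n_nests)
        new = [xi + 0.01 * levy_step(rng) for xi in nests[i]]
        fn = f(new)
        j = rng.randrange(n_nests)
        if fn < fit[j]:                          # replace a random, worse nest
            nests[j], fit[j] = new, fn
        # Abandon the fraction pa of worst nests; the best nest is never abandoned.
        worst = sorted(range(n_nests), key=lambda k: fit[k], reverse=True)
        for k in worst[: int(pa * n_nests)]:
            nests[k] = [rng.uniform(-5, 5) for _ in range(dim)]
            fit[k] = f(nests[k])
    best = min(range(n_nests), key=lambda k: fit[k])
    return nests[best], fit[best]

sphere = lambda z: sum(v * v for v in z)
x, fx = cuckoo_search(sphere)
```

To train a BPNN with CS as in the entry above, f would be the network's training error evaluated at a flattened weight vector.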

An Online Backpropagation Algorithm with Validation Error-Based Adaptive Learning Rate [chapter]

Stefan Duffner, Christophe Garcia
2007 Lecture Notes in Computer Science  
We present a new learning algorithm for feed-forward neural networks based on the standard Backpropagation method using an adaptive global learning rate.  ...  The proposed algorithm is a heuristic method consisting of two phases.  ...  Introduction The Backpropagation (BP) algorithm [1] is probably the most popular learning algorithm for multilayer perceptron (MLP)-type neural architectures due to its simplicity and effectiveness.  ... 
doi:10.1007/978-3-540-74690-4_26 fatcat:h4saddgm5vgbpnz4xzr6btpcki
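A validation-error-based adaptive global learning rate of the kind the abstract describes can be sketched as: shrink the rate when validation error rises, grow it slightly while it keeps falling. The paper's two-phase method is not reproduced here; the 1-D linear model, rates, and data below are illustrative:

```python
import random

def train_adaptive_lr(train, val, epochs=50, lr=0.1, down=0.5, up=1.1):
    """Online GD on y = w*x; the single global lr adapts to the validation error."""
    w, prev_val, best = 0.0, float("inf"), (float("inf"), 0.0)
    for _ in range(epochs):
        for x, y in train:
            w -= lr * 2 * (w * x - y) * x        # per-sample gradient step
        val_err = sum((w * x - y) ** 2 for x, y in val) / len(val)
        if val_err < best[0]:
            best = (val_err, w)                  # remember the best weights seen
        lr = lr * down if val_err > prev_val else lr * up
        prev_val = val_err
    return best[1], lr                           # best-by-validation weights

rng = random.Random(0)
true_w = 3.0
data = [(x, true_w * x + rng.gauss(0, 0.01))
        for x in [rng.uniform(-1, 1) for _ in range(40)]]
w, lr = train_adaptive_lr(data[:30], data[30:])
```

Returning the best-by-validation weights also gives a simple form of early stopping for free, since a late unstable epoch cannot overwrite a good earlier solution.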

Metamodel-assisted ultra-fast memetic optimization of a PLL for WiMax and MMDS applications

Oleg Garitselov, Saraju P. Mohanty, Elias Kougianos, Oghenekarho Okobiah
2012 Thirteenth International Symposium on Quality Electronic Design (ISQED)  
The design flow relies on a multiple-layer feedforward neural network metamodel of the nano-CMOS circuit.  ...  This paper proposes a novel ultra-fast design flow that uses memetic optimization algorithms over neural-network-based non-polynomial metamodels for design-space exploration.  ...  MULTILAYER FEED-FORWARD NEURAL NETWORK BASED NON-POLYNOMIAL METAMODELING A multiple-layer neural network (NN) consists of inputs, a nonlinear activation function in the hidden layer, and a linear activation  ... 
doi:10.1109/isqed.2012.6187552 dblp:conf/isqed/GaritselovMKO12 fatcat:3chjablcsbg4penood5uw2ga6u

Countering the Problem of Oscillations in Bat-BP Gradient Trajectory by Using Momentum [chapter]

Nazri Mohd. Nawi, M. Z. Rehman, Abdullah Khan
2013 Lecture Notes in Electrical Engineering  
Previously, a meta-heuristic search algorithm called Bat was proposed to train BPNN to achieve fast convergence in the neural network.  ...  Metaheuristic techniques have recently been used to counter problems such as slow convergence to global minima and network stagnancy in the backpropagation neural network (BPNN) algorithm.  ...  ACKNOWLEDGEMENTS The researchers would like to thank Universiti Tun Hussein Onn Malaysia (UTHM) for supporting this project.  ... 
doi:10.1007/978-981-4585-18-7_12 fatcat:nscw5krh7jesxgypcyfzziy7hq
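The momentum remedy named in the title keeps a decaying velocity that averages successive gradients, damping the oscillation of the gradient trajectory across steep directions. A minimal sketch on an ill-conditioned quadratic, where plain gradient descent zig-zags along the steep axis (the function and constants are illustrative, not Bat-BP itself):

```python
def gd_momentum(grad, x0, lr=0.1, mu=0.9, iters=200):
    """Gradient descent with momentum: v accumulates gradients, x follows v."""
    x, v = list(x0), [0.0] * len(x0)
    for _ in range(iters):
        g = grad(x)
        v = [mu * vi - lr * gi for vi, gi in zip(v, g)]   # velocity update
        x = [xi + vi for xi, vi in zip(x, v)]             # position update
    return x

# Ill-conditioned quadratic f(x) = 0.5 * (x0^2 + 50 * x1^2); its gradient:
grad = lambda x: [x[0], 50.0 * x[1]]
x = gd_momentum(grad, [5.0, 1.0], lr=0.02, mu=0.9)
```

Along the steep x1 axis the sign of the gradient flips on consecutive steps, so the momentum term cancels the oscillation while it accelerates progress along the shallow x0 axis.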

Simultaneous Evolution of Neural Network Topologies and Weights for Classification and Regression [chapter]

Miguel Rocha, Paulo Cortez, José Neves
2005 Lecture Notes in Computer Science  
Artificial Neural Networks (ANNs) are important Data Mining (DM) techniques.  ...  Competitive results were achieved when compared with a heuristic model selection and other DM algorithms.  ...  Conclusions In this work, a Simultaneous Evolutionary Neural Network (SENN) algorithm is proposed, aiming at the optimization of the neural structure and weights.  ... 
doi:10.1007/11494669_8 fatcat:tr36m2qvq5bhpmwi3czlspokby
Showing results 1 — 15 out of 3,566 results