5,591 Hits in 4.8 sec

Evolutionary Multi-task Learning for Modular Training of Feedforward Neural Networks [chapter]

Rohitash Chandra, Abhishek Gupta, Yew-Soon Ong, Chi-Keong Goh
2016 Lecture Notes in Computer Science  
In this paper, we present a multi-task learning method for neural networks that evolves modular network topologies.  ...  In the past, neuro-evolution has shown promising performance for a number of real-world applications. Recently, evolutionary multitasking has been proposed for optimisation problems.  ...  Essentially, neuro-evolution employs an evolutionary algorithm (EA) that features operators such as selection, crossover and mutation for evolving the weights of the feedforward network.  ... 
doi:10.1007/978-3-319-46672-9_5 fatcat:jabt3znegnhxzjegsckdrgcwhi
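The snippet above names the core neuro-evolution loop: an EA with selection, crossover and mutation acting on feedforward weights. A minimal, hypothetical sketch of that loop (a tiny 2-2-1 network fitted to XOR; the network shape, task and all parameter values are illustrative, not taken from the paper):

```python
import math
import random

# Hypothetical sketch of neuro-evolution: an EA with truncation selection,
# uniform crossover and Gaussian mutation evolves the 9 weights of a
# tiny 2-2-1 feedforward network fitted to XOR.

XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def forward(w, x):
    # w = [hidden weights (4), hidden biases (2), output weights (2), output bias (1)]
    h = [math.tanh(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[4 + j]) for j in range(2)]
    return w[6] * h[0] + w[7] * h[1] + w[8]

def fitness(w):
    # negative total squared error: higher is better
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

def mutate(w, sigma=0.3):
    return [g + random.gauss(0, sigma) for g in w]

def crossover(a, b):
    # uniform crossover: each gene taken from either parent
    return [random.choice(pair) for pair in zip(a, b)]

def evolve(pop_size=30, generations=200, seed=0):
    random.seed(seed)
    pop = [[random.uniform(-1, 1) for _ in range(9)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]  # elitist truncation selection
        pop = parents + [mutate(crossover(random.choice(parents), random.choice(parents)))
                         for _ in range(pop_size - len(parents))]
    return max(pop, key=fitness)

best = evolve()
```

Because the parents survive each generation, the best fitness is monotone non-decreasing, which is what makes even this crude EA reliable on a 9-parameter problem.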

The Need for Recurrent Learning Neural Network and Combine Pareto Differential Algorithm for Multi-Objective Optimization of Real Time Reservoir Operations

Abiodun Ajala, Josiah Adeyemo, Semiu Akanmu
2020 Journal of Soft Computing in Civil Engineering  
This review study, based on a systematic literature analysis, presents the suitability of Recurrent Learning Neural Network (RLNN) and Combine Pareto Multi-objective Differential Evolution (CPMDE) algorithms  ...  It also presents the need for a hybrid RLNN-CPMDE, with the use of CPMDE in the development of RLNN learning data, for reservoir operation optimization in a multi-objective and real-time environment.  ...  Recurrent learning neural network for optimizing real-time reservoir operations: RLNN is an ANN algorithm built on the recurrent network architecture.  ... 
doi:10.22115/scce.2020.226578.1204 doaj:40d7cd8a08ce440fad9ebc7a1c1b2b60 fatcat:2kumgxh46vfmhoa4mkmzqnpgd4

RETRACTED: Evolutionary design of constructive multilayer feedforward neural network

Ching-Han Chen, Tun-Kai Yao, Chia-Ming Kuo, Chen-Yuan Chen
2012 Journal of Vibration and Control  
This paper proposes an evolutionary design methodology for multilayer feedforward neural networks based on a constructive approach.  ...  Based on the constructive representation of multilayer feedforward neural networks, we use a genetic encoding method, after which the evolution process is elaborated for designing the optimal neural network  ...  Conclusion The evolutionary design methodology of multilayer feedforward neural networks is characterized by the following virtues: the neural network building blocks are of modular and hierarchical structure  ... 
doi:10.1177/1077546312456726 fatcat:uuhvn5r34beijcoz4pmouef3zu

NVIS: an interactive visualization tool for neural networks

Matthew J. Streeter, Matthew O. Ward, Sergio A. Alvarez, Robert F. Erbacher, Philip C. Chen, Jonathan C. Roberts, Craig M. Wittenbrink, Matti Grohn
2001 Visual Data Exploration and Analysis VIII  
members of a population of ANNs as they evolve under an evolutionary algorithm.  ...  The authors have made use of these features to obtain insights into both the workings of single neural networks and the evolutionary process, based upon which we consider NVIS to be an effective visualization  ...  ACKNOWLEDGEMENTS The authors thank Soraya Rana for helpful discussions and for providing a library for evolutionary strategies that was modified for use in the system described in this paper.  ... 
doi:10.1117/12.424934 fatcat:4tfczbohynfb7dyskzjjcpv37m

Memetic cooperative coevolution of Elman recurrent neural networks

Rohitash Chandra
2013 Soft Computing - A Fusion of Foundations, Methodologies and Applications  
This paper applies the memetic cooperative coevolution method for training recurrent neural networks on grammatical inference problems.  ...  Cooperative coevolution decomposes an optimisation problem into subcomponents and collectively solves them using evolutionary algorithms.  ...  The adaptive local search intensity gave good performance for training feedforward neural networks for pattern classification (Chandra et al. 2012b) .  ... 
doi:10.1007/s00500-013-1160-1 fatcat:opgcorxzxzacjezr4nnurhzbne
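The decomposition idea in the abstract — split the optimisation problem into subcomponents, evolve each with its own subpopulation, and evaluate individuals in the context of the best members of the other subcomponents — can be sketched as follows. This is a hypothetical toy on a 6-dimensional sphere function, not the paper's recurrent-network training:

```python
import random

# Hypothetical toy sketch of cooperative coevolution: a 6-dimensional sphere
# function is split into three 2-dimensional subcomponents, each with its own
# subpopulation; a candidate is scored by splicing it into a context vector
# built from the best member of every subcomponent.

N_SUB, SUB_DIM = 3, 2

def sphere(x):
    return sum(v * v for v in x)  # minimum 0 at the origin

def evaluate(i, candidate, context):
    # replace subcomponent i's slot in the context vector with the candidate
    full = list(context)
    full[i * SUB_DIM:(i + 1) * SUB_DIM] = candidate
    return sphere(full)

def coevolve(pop_size=20, cycles=100, seed=1):
    random.seed(seed)
    pops = [[[random.uniform(-5, 5) for _ in range(SUB_DIM)]
             for _ in range(pop_size)] for _ in range(N_SUB)]
    best = [pops[i][0][:] for i in range(N_SUB)]
    context = [v for part in best for v in part]
    for _ in range(cycles):
        for i in range(N_SUB):  # round-robin over subcomponents
            ranked = sorted(pops[i], key=lambda c: evaluate(i, c, context))
            elite = ranked[: pop_size // 2]
            pops[i] = elite + [[g + random.gauss(0, 0.2) for g in random.choice(elite)]
                               for _ in range(pop_size - len(elite))]
            best[i] = ranked[0]
            context = [v for part in best for v in part]
    return context

solution = coevolve()
```

The sphere function is separable, so the subcomponents barely interact here; the point of the sketch is only the mechanics of context-based evaluation, which is what matters on genuinely coupled problems such as network weights.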

Hybrid artificial neural network

Nadia Nedjah, Ajith Abraham, Luiza M. Mourelle
2007 Neural computing & applications (Print)  
In the third paper, entitled "An ant colony optimization algorithm for continuous optimization: application to feed-forward neural network training", K. Socha and C.  ...  The results show that the best of our algorithms are comparable with gradient-based algorithms for neural network training and our algorithms compare favorably with a basic genetic algorithm.  ...  Grosan propose an ensemble of three learning algorithms namely an evolutionary artificial neural network, Takagi-Sugeno neuro-fuzzy system and an artificial neural network to solve the problem of parameter  ... 
doi:10.1007/s00521-007-0083-0 fatcat:ktd2xql7ybbuvaccf2z42duydm
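The ant colony approach to continuous optimisation mentioned above keeps an archive of good solutions and samples new candidates from Gaussian kernels centred on archive members. A hypothetical sketch in that spirit (the objective is a simple stand-in for a network-training error; none of the settings come from the paper):

```python
import random

# Hypothetical sketch of ant colony optimization for continuous domains:
# an archive of good solutions defines Gaussian kernels from which new
# candidates ("ants") are sampled; the archive keeps only the best.

def objective(x):
    return sum((v - 1.0) ** 2 for v in x)  # minimum 0 at (1, 1, 1)

def aco_continuous(dim=3, archive_size=10, n_ants=10, iters=100, seed=4):
    random.seed(seed)
    archive = sorted(([random.uniform(-5, 5) for _ in range(dim)]
                      for _ in range(archive_size)), key=objective)
    for _ in range(iters):
        ants = []
        for _ in range(n_ants):
            guide = random.choice(archive[: archive_size // 2])  # bias toward good solutions
            ant = []
            for d in range(dim):
                # kernel width: mean per-dimension distance from archive to the guide
                sigma = sum(abs(s[d] - guide[d]) for s in archive) / (archive_size - 1)
                ant.append(random.gauss(guide[d], sigma + 1e-9))
            ants.append(ant)
        # merge ants into the archive and truncate to the best solutions
        archive = sorted(archive + ants, key=objective)[:archive_size]
    return archive[0]

best = aco_continuous()
```

The kernel width shrinks automatically as the archive converges, giving coarse exploration early and fine local search late — the property that makes this family of methods competitive with gradient-based training on small networks.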

A successful interdisciplinary course on computational intelligence

G.K.K. Venayagamoorthy
2009 IEEE Computational Intelligence Magazine  
The author is grateful for the following departments at the Missouri University of Science and Technology for offering the experimental course on computational intelligence to their students: Electrical  ...  Acknowledgments The financial support from the National Science Foundation (USA) under the CAREER Grant: ECCS# 0348221 and CCLI Grant: DUE # 0633299 is gratefully acknowledged for the development of this  ...  Quantum inspired evolutionary algorithms were developed and compared with binary particle swarm optimization for training feedforward and recurrent neural networks on complex problems [27] .  ... 
doi:10.1109/mci.2008.930983 fatcat:4io5epgatzeh5njlywb4kfmjiy

Evolutionary Neural Networks with Mixed-Integer Hybrid Differential Evolution

Yung-Chin Lin, Yung-Chien Lin, Wen-Cheng Chang, Kuo-Lan Su
2011 Journal of Computers  
Index Terms: neural networks, mixed-integer optimization, evolutionary algorithm  ...  And then a mixed-integer evolutionary algorithm (Mixed-Integer Hybrid Differential Evolution, MIHDE) is used to optimize the neural network.  ...  And then a mixed-integer evolutionary algorithm, MIHDE, is used to optimize the weights and architectures of neural networks.  ... 
doi:10.4304/jcp.6.8.1591-1596 fatcat:x45xopjjajdhvapwchprrcxebu

Using different types of neural networks in detection the body's readiness for blood donation and determining the value of each of its parameters using genetic algorithm

Zahra Jafari, Asma Mahdavi Yousefi, Saman Rajabi
2020 Innovaciencia  
In this study, data is extracted from the blood transfusion service center, and the perceptron neural network, RBF neural network, Fisher's discrimination ratio and genetic algorithm were examined, and  ...  An artificial neural network is a data processing system that takes its ideas from the human brain, designing a data structure that acts like a neuron and creating a network between  ...  First, a single-layer perceptron neural network is designed for all features, as shown in Figure 2.  ... 
doi:10.15649/2346075x.998 fatcat:fqcpbxkxl5actjwh57k52moofy

Autonomous self-configuration of artificial neural networks for data classification or system control

Wolfgang Fink
2009 Space Exploration Technologies II  
Artificial neural networks (ANNs) are powerful methods for the classification of multi-dimensional data as well as for the control of dynamic systems.  ...  We report on the use of a Stochastic Optimization Framework (SOF; Fink, SPIE 2008) for the autonomous self-configuration of Artificial Neural Networks (i.e., the determination of number of hidden layers  ...  Setup for Autonomous SOF-based ANN Architecture Design: In the example case of a feedforward network, the only fixed/specific parameters for an ANN architecture design  ... 
doi:10.1117/12.821836 fatcat:vsdedhnjnbdtzelpx5xwtdbyp4

A learning automata-based algorithm for determination of the number of hidden units for three-layer neural networks

Hamid Beigy, Mohamad Reza Meybodi
2009 International Journal of Systems Science  
There is no method to determine the optimal topology of a multi-layer neural network for a given problem. Usually the designer selects a topology for the network and then trains it.  ...  Since determination of the optimal topology of neural networks belongs to the class of NP-hard problems, most of the existing algorithms for determination of the topology are approximate.  ...  A Fast Method for Determining the Number of Hidden Units in Feedforward Neural Networks, Proceedings of CSICC-97, Tehran, Iran, 1, pp. 414-420.  ... 
doi:10.1080/00207720802145924 fatcat:n6tj7xezvvcwpcbij4advgzj3m

Delay nonlinear system predictive control on MPSO+DNN

Min Han, Jia Fan
2009 2009 IEEE International Conference on Systems, Man and Cybernetics  
This paper presents a novel dynamic neural network (DNN) predictive control strategy based on modified particle swarm optimization (PSO) for long-time-delay nonlinear processes.  ...  An improved version of the original PSO is put forward to train the parameters of the NN to enhance convergence and accuracy.  ...  DYNAMIC FEEDFORWARD NEURAL NETWORKS A multi-input, single-output feedforward neural network is chosen as an example to illustrate the proposed dynamic feedforward neural network (DNN); the structure is shown  ... 
doi:10.1109/icsmc.2009.5346799 dblp:conf/smc/HanF09 fatcat:lnxh6cexarg75p5rsu3lskkr4a
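The abstract describes PSO training the parameters of a network. A hypothetical minimal sketch of standard PSO fitting the two parameters of a single linear neuron (the inertia and acceleration constants are common textbook values; the paper's modified PSO and dynamic network are not reproduced):

```python
import random

# Hypothetical sketch: particle swarm optimization fits y = w*x + b to data
# generated from y = 2x + 1. Each particle tracks its personal best; the
# swarm shares a global best.

DATA = [(x, 2.0 * x + 1.0) for x in range(-3, 4)]

def loss(p):
    w, b = p
    return sum((w * x + b - y) ** 2 for x, y in DATA)

def pso(n_particles=20, iters=100, seed=2):
    random.seed(seed)
    pos = [[random.uniform(-5, 5), random.uniform(-5, 5)] for _ in range(n_particles)]
    vel = [[0.0, 0.0] for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=loss)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(2):
                # inertia + cognitive (personal best) + social (global best) terms
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * random.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if loss(pos[i]) < loss(pbest[i]):
                pbest[i] = pos[i][:]
                if loss(pos[i]) < loss(gbest):
                    gbest = pos[i][:]
    return gbest

w, b = pso()
```

The same loop trains any parameter vector — a full network's weights simply make `pos[i]` longer — which is why PSO is a drop-in alternative to gradient descent when the model is non-differentiable or the error surface is awkward.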

Research status and applications of nature-inspired algorithms for agri-food production

Yanbo Huang, USDA-ARS Crop Production Systems Research Unit, Stoneville, MS 38776, USA
2020 International Journal of Agricultural and Biological Engineering  
In this paper, the developments of artificial neural networks and deep learning algorithms are presented and discussed in conjunction with their biological connections for agri-food applications.  ...  Machine learning algorithms from artificial neurons and artificial neural networks have been developed to mimic the human brain with synthetic neurons.  ...  Artificial neural networks and soft computing Obviously, compared with hand-designed rule AI systems, ML offers an "open-world" scheme to design and develop new-generation AI systems.  ... 
doi:10.25165/j.ijabe.20201304.5501 fatcat:cq2moh4ep5b5phtgr7f4tixbje

Performance and efficiency: recent advances in supervised learning

Sheng Ma, Chuanyi Ji
1999 Proceedings of the IEEE  
for combinations of weak classifiers, and expectation-maximization algorithms for training static and dynamic neural networks.  ...  We discuss four types of learning approaches: training an individual model; combinations of several well-trained models; combinations of many weak models; and evolutionary computation of models.  ...  Zhao for relevant references and helpful discussion. Special thanks are also due to D. B. Fogel for valuable and detailed comments.  ... 
doi:10.1109/5.784228 fatcat:eb2avzs7gjgdvj5in75eqntwoq

Neural Network Design Using an Evolutionary Algorithm: A Financial Example

Tony Brabazon
2002 Accounting Finance & Governance Review  
Utilising financial time series data drawn from the FTSE 100 index, this paper demonstrates how an evolutionary algorithm, the genetic algorithm, can be applied in order to automate the process of determining  ...  This suggests that the application of fully connected, feedforward neural networks is not always appropriate and may result in network "bloat".  ...  and connection weights in a feedforward neural network (NN) model.  ... 
doi:10.52399/001c.35313 fatcat:isto4s4ocvebfkpddiossl6bie
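The paper's point — a GA can jointly evolve connection structure and weights, and penalising connections counteracts network "bloat" — can be illustrated with a hypothetical toy: a linear model with two informative and two irrelevant inputs, a binary connection mask, and a small per-connection penalty. Nothing here reproduces the paper's FTSE 100 setup.

```python
import random

# Hypothetical toy: a GA jointly evolves a binary connection mask and
# real-valued weights of a linear model. The target depends only on inputs
# 0 and 1; a per-connection penalty pushes the GA to switch the two
# irrelevant connections off.

random.seed(3)
DATA = []
for _ in range(30):
    x = [random.uniform(-1, 1) for _ in range(4)]
    DATA.append((x, 3.0 * x[0] - 2.0 * x[1]))  # inputs 2 and 3 are irrelevant

def fitness(ind):
    mask, weights = ind
    mse = sum((sum(m * w * xi for m, w, xi in zip(mask, weights, x)) - y) ** 2
              for x, y in DATA) / len(DATA)
    return -mse - 0.05 * sum(mask)  # penalise each active connection

def mutate(ind):
    mask, weights = ind
    return ([b ^ (random.random() < 0.1) for b in mask],       # flip mask bits
            [w + random.gauss(0, 0.2) for w in weights])        # perturb weights

def ga(pop_size=30, generations=150):
    pop = [([random.randint(0, 1) for _ in range(4)],
            [random.uniform(-1, 1) for _ in range(4)]) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # elitist truncation selection
        pop = elite + [mutate(random.choice(elite)) for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

mask, weights = ga()
```

Because dropping an irrelevant connection saves its penalty without raising the error, the selected structure is sparser than a fully connected model — the same pressure the paper uses against "bloat".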
Showing results 1 — 15 out of 5,591 results