Multi-step ahead nonlinear identification of Lorenz's chaotic system using radial basis neural network with learning by clustering and particle swarm optimization
2008
Chaos, Solitons & Fractals
This paper presents a hybrid training approach based on clustering methods (k-means and c-means) to tune the centers of Gaussian functions used in the hidden layer of RBF-NNs. ...
This design also uses particle swarm optimization (PSO) for centers (as a local clustering search method) and spread tuning, and the Moore-Penrose pseudoinverse for the adjustment of the RBF-NN output weights (a minimal sketch follows this entry). ...
Neural networks are originally inspired by the functionality of biological neural networks, which are able to learn complex functional relations based on a limited number of training data. ...
doi:10.1016/j.chaos.2006.05.077
fatcat:adgdrsvfcna7vaipsjlgiy7knm
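The entry above describes a hybrid RBF-NN training scheme: clustering to place the Gaussian centers, PSO to tune centers and spreads, and the Moore-Penrose pseudoinverse for the output weights. The sketch below is a minimal illustration of that general idea, not the paper's method: it uses k-means for the centers, a single shared spread (the PSO spread tuning is omitted), and a pseudoinverse solve for the output weights; all function names and parameters are illustrative.

```python
# Minimal RBF-network sketch: centers by k-means, output weights by pseudoinverse.
# Assumptions: one shared spread (no PSO tuning), generic regression data X, y.
import numpy as np
from sklearn.cluster import KMeans

def train_rbf(X, y, n_centers=10, spread=1.0):
    centers = KMeans(n_clusters=n_centers, n_init=10).fit(X).cluster_centers_
    # Hidden-layer activations: Gaussian of the distance to each center.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-(d ** 2) / (2.0 * spread ** 2))
    # Output weights via the Moore-Penrose pseudoinverse (least-squares fit).
    W = np.linalg.pinv(H) @ y
    return centers, spread, W

def predict_rbf(X, centers, spread, W):
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    H = np.exp(-(d ** 2) / (2.0 * spread ** 2))
    return H @ W
```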
Towards conservative helicopter loads prediction using computational intelligence techniques
2012
The 2012 International Joint Conference on Neural Networks (IJCNN)
Hybrid and memetic approaches combining the deterministic optimization and evolutionary computation techniques were also explored. ...
Hybrid models using PSO and LM learning provided accurate and correlated predictions for the main rotor loads in both flight conditions. ...
Hybrid and Memetic Approaches: A common issue of deterministic (gradient-based) techniques is the local entrapment problem, which can be mitigated by combining local and global search techniques. ...
doi:10.1109/ijcnn.2012.6252624
dblp:conf/ijcnn/ValdesCL12
fatcat:svehus5lzff3tjykkweqlrriya
A System for Robotic Heart Surgery that Learns to Tie Knots Using Recurrent Neural Networks
2008
Advanced Robotics
Current solutions to this problem replay manually programmed trajectories, but a more general and robust approach is to use supervised machine learning to smooth surgeon-given training trajectories and ...
Instead we exploit more powerful, recurrent neural networks (RNNs) with adaptive internal states. ...
ACKNOWLEDGMENTS This research was partially funded by SNF grant 200020-107534 and the EU MindRaces project FP6 511931. ...
doi:10.1163/156855308x360604
fatcat:ijy7p7d3x5eodmkvp2zpokkrry
A System for Robotic Heart Surgery that Learns to Tie Knots Using Recurrent Neural Networks
2006
2006 IEEE/RSJ International Conference on Intelligent Robots and Systems
Current solutions to this problem replay manually programmed trajectories, but a more general and robust approach is to use supervised machine learning to smooth surgeon-given training trajectories and ...
Instead we exploit more powerful, recurrent neural networks (RNNs) with adaptive internal states. ...
ACKNOWLEDGMENTS This research was partially funded by SNF grant 200020-107534 and the EU MindRaces project FP6 511931. ...
doi:10.1109/iros.2006.282190
dblp:conf/iros/MayerGWNKS06
fatcat:qkmvbfl7gzfupajt7heh4lc4ke
Evolutionary Optimization of RBF Networks
2001
International Journal of Neural Systems
To this end, it presents an overall view of the problems involved and the different approaches used to genetically optimize RBF networks. ...
One of the main obstacles to the widespread use of artificial neural networks is the difficulty of adequately defining values for their free parameters. ...
Acknowledgment: The authors would like to thank FAPESP and CNPq for their support. ...
doi:10.1016/s0129-0657(01)00073-4
fatcat:bezrxwibrzcatbort3f33orz6q
Evolutionary Optimization of RBF Networks
2001
International Journal of Neural Systems
To this end, it presents an overall view of the problems involved and the different approaches used to genetically optimize RBF networks. ...
One of the main obstacles to the widespread use of artificial neural networks is the difficulty of adequately defining values for their free parameters. ...
Acknowledgment: The authors would like to thank FAPESP and CNPq for their support. ...
doi:10.1142/s0129065701000734
pmid:11577381
fatcat:mjy5ozvisvelfkp5a7zvrc46cu
A parameter optimization method for radial basis function type models
2003
IEEE Transactions on Neural Networks
A structured nonlinear parameter optimization method (SNPOM) adapted to radial basis function (RBF) networks and an RBF network-style coefficients AutoRegressive model with eXogenous variable (RBF-ARX ...
Index Terms-Identification, nonlinear systems, parameter estimation, radial basis function (RBF), AutoRegressive model with eXogenous variable (RBF-ARX), RBF neural network, state-dependent model. ...
ACKNOWLEDGMENT The authors would like to thank the editors and the anonymous referees for their valuable comments. ...
doi:10.1109/tnn.2003.809395
pmid:18238025
fatcat:zmv24z67uvbbpnjjueorzc3qee
Training Recurrent Networks by Evolino
2007
Neural Computation
Sometimes, however, gradient information is of little use for training RNNs, due to numerous local minima. ...
In recent years, gradient-based LSTM recurrent neural networks (RNNs) solved many previously RNN-unlearnable tasks. ...
However, for the present Evolino variant, where fine local search is desirable, ESP uses Cauchy-distributed mutation to produce all new individuals, making the approach in effect an Evolution Strategy ... (a minimal sketch of Cauchy mutation follows this entry)
doi:10.1162/neco.2007.19.3.757
pmid:17298232
fatcat:vm7iizahmvdunjzhwmeb4tbc4u
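The Evolino entry above mentions Cauchy-distributed mutation producing all new individuals, which effectively turns the search into an Evolution Strategy. The sketch below illustrates only that mutation idea in a generic (1+lambda) strategy; the objective function, population size, and scale parameter are illustrative assumptions, not the paper's setup.

```python
# (1+lambda) evolution strategy with Cauchy-distributed mutation (heavier tails
# than Gaussian noise, allowing occasional long jumps during local search).
# The "sphere" objective and all hyperparameters are illustrative only.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def cauchy_es(dim=5, offspring=20, scale=0.1, generations=200, seed=0):
    rng = np.random.default_rng(seed)
    parent = rng.normal(size=dim)
    best = sphere(parent)
    for _ in range(generations):
        # Every new individual is the parent plus Cauchy-distributed noise.
        children = parent + scale * rng.standard_cauchy(size=(offspring, dim))
        scores = np.array([sphere(c) for c in children])
        i = int(np.argmin(scores))
        if scores[i] < best:  # plus-selection: keep the parent if it is better
            parent, best = children[i], scores[i]
    return parent, best
```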
Efficient Training Algorithms for a Class of Shunting Inhibitory Convolutional Neural Networks
2005
IEEE Transactions on Neural Networks
Index Terms-Convolutional neural network (CoNN), first- and second-order training methods, shunting inhibitory neuron. ...
Shunting inhibitory convolutional neural networks. ...
In case of a change in the sign of the local gradient and an increase in the network error, a backtracking process is also included to revert to the previous weight, which is multiplied by an adaptive ...
doi:10.1109/tnn.2005.845144
pmid:15940985
fatcat:pueeezqfp5gz7b3tzyh4tbnww4
Studies on Optimization Algorithms for Some Artificial Neural Networks Based on Genetic Algorithm (GA)
2011
Journal of Computers
First, an optimized BP neural network is set up: GA is used to optimize the connection weights of the network, and also to optimize both the connection weights and the architecture (a minimal sketch of GA-based weight optimization follows this entry). ...
Second, an optimized RBF neural network is proposed. ...
EAs are extendable and easy to hybridize. (3) EAs are directed stochastic global search. ...
doi:10.4304/jcp.6.5.939-946
fatcat:mvmk2pm2izgrxi6xvtrjjokjwy
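The entry above uses a GA to optimize the connection weights of a BP network. The sketch below shows that general idea under illustrative assumptions: a tiny 1-8-1 feedforward net fitted to y = sin(x), truncation selection, uniform crossover, and sparse Gaussian mutation; none of these settings come from the paper.

```python
# GA evolving the flattened weight vector of a small 1-8-1 feedforward network.
# Task, network size, and GA settings are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 64).reshape(-1, 1)
y = np.sin(X)

HIDDEN = 8
N_W = 1 * HIDDEN + HIDDEN + HIDDEN * 1 + 1  # weights and biases of a 1-8-1 net

def forward(w, X):
    W1 = w[:HIDDEN].reshape(1, HIDDEN)
    b1 = w[HIDDEN:2 * HIDDEN]
    W2 = w[2 * HIDDEN:3 * HIDDEN].reshape(HIDDEN, 1)
    b2 = w[-1]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # higher is better (negative MSE)

pop = rng.normal(size=(60, N_W))
for gen in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-20:]]     # truncation selection
    children = []
    while len(children) < len(pop):
        a, b = parents[rng.integers(20, size=2)]
        mask = rng.random(N_W) < 0.5            # uniform crossover
        child = np.where(mask, a, b)
        child += rng.normal(scale=0.1, size=N_W) * (rng.random(N_W) < 0.1)  # mutation
        children.append(child)
    pop = np.array(children)
```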
A hybrid algorithm to optimize RBF network architecture and parameters for nonlinear time series prediction
2012
Applied Mathematical Modelling
Performance of the presented hybrid approach is evaluated by several benchmark time series modeling and prediction problems. ...
This paper proposes a novel hybrid algorithm for automatic selection of the proper input variables, the number of hidden nodes of the radial basis function (RBF) network, and optimizing network parameters ...
The SNPOM is used as the local search algorithm to improve the convergence rate and modeling accuracy. ...
doi:10.1016/j.apm.2011.09.066
fatcat:6ejyydlk5rh3havwnfrkgisqcq
Learning Neural Networks for Visual Servoing Using Evolutionary Methods
2006
2006 Sixth International Conference on Hybrid Intelligent Systems (HIS'06)
In this article we introduce a method to learn neural networks that solve a visual servoing task. ...
Our method, called EANT, Evolutionary Acquisition of Neural Topologies, starts from a minimal network structure and gradually develops it further using evolutionary reinforcement learning. ...
The authors also wish to thank Nikolaus Hansen, the developer of CMA-ES, for his kind support which helped us to quickly start applying his method. ...
doi:10.1109/his.2006.264889
fatcat:v4jus2bidnhjtc7pxqbuizd6ei
A Hybrid HMM-Based Speech Recognizer Using Kernel-Based Discriminants as Acoustic Models
2006
18th International Conference on Pattern Recognition (ICPR'06)
We integrate this method in a hybrid HMM-based speech recognition system by translating the outputs of the kernel-based classifier into class-conditional probabilities and using them instead of Gaussian ...
In this paper we propose a novel order-recursive training algorithm for kernel-based discriminants which is computationally efficient. ...
Artificial Neural Networks [3] have been used in the past. In the last few years, kernel-based methods have shown excellent performance in pattern recognition tasks [13]. ...
doi:10.1109/icpr.2006.82
dblp:conf/icpr/AndelicSKK06
fatcat:lfyn5ddwsjevfgmb3rq2kjg7bq
Optimized Random Vector Functional Link network to predict oil production from Tahe oil field in China
2020
Oil & Gas Science and Technology
The Spherical Search Optimizer (SSO) is applied to optimize the RVFL and to enhance its performance, where SSO works as a local search method that improves the parameters of the RVFL. ...
In this study, we propose an improved Random Vector Functional Link (RVFL) network to predict oil production from Tahe oil field in China. ...
"mode" is the approach applied to enhance the weights Moore-Penrose pseudoinverse and regularized least square. ...
doi:10.2516/ogst/2020081
fatcat:hodmykabhrfwvcmo2rst24zrky
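The entry above concerns a Random Vector Functional Link (RVFL) network whose output weights are computed either with the Moore-Penrose pseudoinverse or with regularized least squares. The sketch below shows a generic RVFL with both solution modes; the hidden size, activation, ridge parameter, and function names are illustrative assumptions, and the SSO tuning step from the paper is not included.

```python
# Generic RVFL sketch: random hidden weights, direct input-output links, and
# output weights solved in closed form (pseudoinverse or ridge regression).
import numpy as np

def train_rvfl(X, y, n_hidden=50, mode="pinv", ridge=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1, 1, size=(X.shape[1], n_hidden))
    b = rng.uniform(-1, 1, size=n_hidden)
    H = np.tanh(X @ W + b)
    D = np.hstack([X, H])                      # direct links + hidden features
    if mode == "pinv":                         # Moore-Penrose pseudoinverse
        beta = np.linalg.pinv(D) @ y
    else:                                      # regularized least squares (ridge)
        beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    D = np.hstack([X, np.tanh(X @ W + b)])
    return D @ beta
```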
Multiobjective Cuckoo Search Applied to Radial Basis Function Neural Networks Training for System Identification
2014
IFAC Proceedings Volumes
The multicriteria problem is solved by means of the multiobjective cuckoo search, which is based on an archiving technique and the crowding distance metric (a sketch of the crowding distance computation follows this entry). ...
Focusing in the specific case of radial basis functions neural networks models, we insert the choice of the model complexity and its inputs in the optimization procedure together with the model parameters ...
ACKNOWLEDGEMENTS This study was partially supported by the Brazilian National Council of Scientific and Technological Development (CNPq) under Grants 479764/2013-1 and 307150/2012-7/PQ and by CAPES (Brazilian ...
doi:10.3182/20140824-6-za-1003.01249
fatcat:weacswrakncfrlvnvcvqchbgn4
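The entry above relies on the crowding distance metric to maintain diversity in the archive. The sketch below is a generic NSGA-II-style crowding distance computation, offered only as an illustration of the metric, not of the paper's specific archiving scheme.

```python
# Crowding distance: for each solution, sum the normalized gaps between its
# neighbours along every objective; boundary solutions get infinite distance.
import numpy as np

def crowding_distance(F):
    """F: (n_solutions, n_objectives) array of objective values."""
    n, m = F.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(F[:, j])
        fmin, fmax = F[order[0], j], F[order[-1], j]
        dist[order[0]] = dist[order[-1]] = np.inf   # keep boundary solutions
        if fmax > fmin:
            gaps = (F[order[2:], j] - F[order[:-2], j]) / (fmax - fmin)
            dist[order[1:-1]] += gaps
    return dist
```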
Showing results 1 — 15 out of 327 results