60,443 Hits in 5.2 sec

A Generalized Growing and Pruning RBF (GGAP-RBF) Neural Network for Function Approximation

G.-B. Huang, P. Saratchandran, N. Sundararajan
2005 IEEE Transactions on Neural Networks  
Simulation results for benchmark problems in the function approximation area show that the GGAP-RBF outperforms several other sequential learning algorithms in terms of learning speed, network size, and  ...  Significance of a neuron is a measure of the average information content of that neuron.  ...  In sequential learning, a series of training samples is randomly drawn and presented to, and learned by, the network one by one.  ... 
doi:10.1109/tnn.2004.836241 pmid:15732389 fatcat:zmp5ukcyenhiro2sl7ul22rgpi
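
The snippet's notion of neuron "significance" lends itself to a worked illustration. Below is a minimal sketch, not the paper's algorithm: the significance measure (|output weight| scaled by the neuron's width) is a deliberate simplification of Huang et al.'s average-information-content definition, and all thresholds and names are illustrative. A driver loop for it appears under the GAP-RBF entry further down.

```python
# Minimal sketch of significance-based growing and pruning in a Gaussian RBF net.
# The significance measure is a simplification of the paper's definition.
import numpy as np

class GrowPruneRBF:
    def __init__(self, eps_grow=0.5, err_grow=0.2, sig_prune=1e-3, kappa=0.8):
        self.centers, self.widths, self.weights = [], [], []
        self.eps_grow, self.err_grow = eps_grow, err_grow
        self.sig_prune, self.kappa = sig_prune, kappa

    def _phi(self, x):
        # Gaussian activations of all hidden neurons for input x.
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2 * s ** 2))
                         for c, s in zip(self.centers, self.widths)])

    def predict(self, x):
        return float(self._phi(x) @ np.array(self.weights)) if self.centers else 0.0

    def significance(self):
        # Simplified: |weight| scaled by the neuron's width (its "footprint").
        return [abs(w) * s for w, s in zip(self.weights, self.widths)]

    def learn_one(self, x, y, lr=0.05):
        err = y - self.predict(x)
        dist = min((np.linalg.norm(x - c) for c in self.centers), default=np.inf)
        if abs(err) > self.err_grow and dist > self.eps_grow:
            # Grow: allocate a new neuron at the novel sample.
            self.centers.append(x.copy())
            self.widths.append(self.kappa * dist if np.isfinite(dist) else 1.0)
            self.weights.append(err)
        elif self.centers:
            # Otherwise adjust output weights by gradient descent on this sample.
            self.weights = list(np.array(self.weights) + lr * err * self._phi(x))
        # Prune any neuron whose significance fell below the threshold.
        keep = [i for i, s in enumerate(self.significance()) if s > self.sig_prune]
        self.centers = [self.centers[i] for i in keep]
        self.widths = [self.widths[i] for i in keep]
        self.weights = [self.weights[i] for i in keep]
```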

A Survey on Study of Various Machine Learning Methods for Classification

S. Padma, R. Pugazendi
2015 International Journal of Database Theory and Application  
This article comprises a review of various sequential algorithms. The review describes the working principles of the learning methods.  ...  The aforementioned SRAN works on the basis of a self-regulatory mechanism in order to reduce the loss error and to maximize the class-wise significance.  ...  Learning starts with zero hidden neurons and adds neurons to obtain the best network structure.  ... 
doi:10.14257/ijdta.2015.8.5.23 fatcat:ab3r4gy2xfbtxat3myc5wr73ti

Moving Learning Machine towards Fast Real-Time Applications: A High-Speed FPGA-Based Implementation of the OS-ELM Training Algorithm

Jose V. Frances-Villora, Alfredo Rosado-Muñoz, Manuel Bataller-Mompean, Juan Barrios-Aviles, Juan F. Guerrero-Martinez
2018 Electronics  
This performance enables high-speed sequential training rates, such as a 14 kHz sequential training frequency for an SLFN with 40 hidden neurons, or 180 Hz for one with 500 hidden  ...  speed, but the number of trainings per second (training frequency) achieved in these continuous learning applications has to be further increased.  ...  In this context, in which the sequential arrival of new training data can be handled as an incremental training dataset, on-line sequential learning is the most suitable way of learning.  ... 
doi:10.3390/electronics7110308 fatcat:vanxfwblsffoxdiakjjyv7efu4
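
The per-sample update whose rate this entry reports is, in the published OS-ELM algorithm, a recursive least-squares step on the output weights of a single-hidden-layer feedforward network (SLFN). A minimal sketch follows; the sigmoid hidden nodes, toy target, and regularization constant are assumptions, not details from this entry.

```python
# Minimal OS-ELM sketch: initialization phase, then one RLS update per sample.
import numpy as np

def hidden(X, Win, b):
    # Random-feature hidden layer of an SLFN: sigmoid(X @ Win + b).
    return 1.0 / (1.0 + np.exp(-(X @ Win + b)))

rng = np.random.default_rng(0)
n_in, n_hidden = 4, 40
Win = rng.normal(size=(n_in, n_hidden))   # input weights stay fixed (random)
b = rng.normal(size=n_hidden)

# Initialization phase: solve a small batch by regularized least squares.
X0 = rng.normal(size=(60, n_in))
T0 = np.sin(X0[:, :1])                    # toy regression target
H0 = hidden(X0, Win, b)
P = np.linalg.inv(H0.T @ H0 + 1e-6 * np.eye(n_hidden))
beta = P @ H0.T @ T0

# Sequential phase: one RLS update per arriving chunk (here, one sample).
for _ in range(1000):
    x = rng.normal(size=(1, n_in))
    t = np.sin(x[:, :1])
    H = hidden(x, Win, b)
    K = np.linalg.inv(np.eye(1) + H @ P @ H.T)
    P = P - P @ H.T @ K @ H @ P
    beta = beta + P @ H.T @ (t - H @ beta)
```

The sequential step involves only small matrix products plus a 1x1 inversion, which is why a pipelined FPGA datapath can sustain kHz-range training frequencies for modest hidden-layer sizes.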

An Efficient Sequential Learning Algorithm for Growing and Pruning RBF (GAP-RBF) Networks

G.-B. Huang, P. Saratchandran, N. Sundararajan
2004 IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics)  
In this paper, the performance of the GAP-RBF learning algorithm is compared with other well-known sequential learning algorithms such as RAN, RANEKF, and MRAN on an artificial problem with uniform input  ...  The algorithm, referred to as growing and pruning (GAP)-RBF, uses the concept of "significance" of a neuron and links it to the learning accuracy.  ...  In sequential learning, a series of training samples is randomly drawn and presented one by one to the network.  ... 
doi:10.1109/tsmcb.2004.834428 pmid:15619929 fatcat:midlr7x2rzfrfnqmm45jstp4li
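
This entry and the GGAP-RBF one above describe the same experimental protocol: samples drawn at random and presented to the network exactly once, with no epochs or replay. A minimal driver for that protocol, reusing the hypothetical GrowPruneRBF sketch from above (the 1-D target function and sample count are illustrative), might look like:

```python
import numpy as np

rng = np.random.default_rng(1)
net = GrowPruneRBF()                         # hedged sketch defined above

# Sequential learning: each sample is drawn at random, presented once,
# learned, and then discarded.
for step in range(2000):
    x = rng.uniform(-1.0, 1.0, size=1)       # uniform input, as in the benchmark
    y = float(0.8 * np.exp(-8.0 * x[0] ** 2))  # toy 1-D target function
    net.learn_one(x, y)

print("final network size:", len(net.centers))
```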

Wavelet Basis Function Neural Networks for Sequential Learning

Ning Jin, Derong Liu
2008 IEEE Transactions on Neural Networks  
A sequential learning algorithm for WBFNNs is presented and compared to the sequential learning algorithm of RBFNNs.  ...  Index Terms: Radial basis function neural network (RBFNN), sequential learning, wavelet basis function neural network (WBFNN).  ...  In sequential learning, a neural network is trained to approximate a function while a series of training sample pairs is randomly drawn and presented to the network.  ... 
doi:10.1109/tnn.2007.911749 pmid:18334370 fatcat:hvdt6wrsfbgw3cyxdcvzfn7fzq
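
The difference between the two network families in this entry is the hidden-unit basis function. A minimal sketch of the two activations follows; the Mexican-hat wavelet is an assumption for illustration (the snippet does not name the paper's wavelet family).

```python
import numpy as np

def gaussian_rbf(x, center, width):
    # Standard RBFNN hidden unit: a radially symmetric Gaussian bump.
    r2 = np.sum((x - center) ** 2) / width ** 2
    return np.exp(-r2 / 2)

def mexican_hat(x, center, width):
    # A wavelet basis function (Mexican hat): localized like the Gaussian
    # but oscillatory and zero-mean, the property WBFNNs exploit.
    r2 = np.sum((x - center) ** 2) / width ** 2
    return (1.0 - r2) * np.exp(-r2 / 2)

x, c = np.array([0.3, -0.1]), np.zeros(2)
print(gaussian_rbf(x, c, 1.0), mexican_hat(x, c, 1.0))
```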

Learning spatiotemporal signals using a recurrent spiking network that discretizes time

Amadeus Maes, Mauricio Barahona, Claudia Clopath, Blake A. Richards
2020 PLoS Computational Biology  
Learning to produce spatiotemporal sequences is a common task that the brain has to solve. The same neurons may be used to produce different sequential behaviours.  ...  Here, we propose a model where a spiking recurrent network of excitatory and inhibitory spiking neurons drives a read-out layer: the dynamics of the driver recurrent network is trained to encode time which  ...  During the first stage of learning, all neurons in each cluster receive the same input in a sequential manner.  ... 
doi:10.1371/journal.pcbi.1007606 pmid:31961853 fatcat:xlbuzh255nc2fp2bhfkuf4gdmq

Emergence of stable striatal D1R and D2R neuronal ensembles with distinct firing sequence during motor learning

Meng-jun Sheng, Di Lu, Zhi-ming Shen, Mu-ming Poo
2019 Proceedings of the National Academy of Sciences of the United States of America  
fired in a sequential manner, with more D1R and D2R neurons firing during the lever-pushing period and intertrial intervals (ITIs), respectively.  ...  This sequential firing pattern was specifically associated with the learned motor behavior, because it changed markedly when the trained mice performed other cued motor tasks.  ...  We observed marked changes in neuronal firing patterns during the training process, with a gradual emergence of stable neuronal ensembles of both D1R and D2R neurons that fired in a sequential manner,  ... 
doi:10.1073/pnas.1901712116 pmid:31072930 pmcid:PMC6561210 fatcat:2mhxatvgf5ai7mdmfs64vscie4

Learning spatiotemporal signals using a recurrent spiking network that discretizes time [article]

Amadeus Maes, Mauricio Barahona, Claudia Clopath
2019 arXiv   pre-print
Here, we propose a model where a spiking recurrent network of excitatory and inhibitory biophysical neurons drives a read-out layer: the dynamics of the driver recurrent network is trained to encode time  ...  Learning to produce spatiotemporal sequences is a common task that the brain has to solve. The same neural substrate may be used by the brain to produce different sequential behaviours.  ...  During spontaneous activity, clusters in the RNN reactivate in a sequential manner driving the learned sequence in the read-out neurons.  ... 
arXiv:1907.08801v2 fatcat:sh3zi4pnpbbs3pch6juezjkwum

Learning spatiotemporal signals using a recurrent spiking network that discretizes time [article]

Amadeus Maes, Mauricio Barahona, Claudia Clopath
2019 bioRxiv   pre-print
Learning to produce spatiotemporal sequences is a common task the brain has to solve. The same neural substrate may be used by the brain to produce different sequential behaviours.  ...  Here, we propose a model where a spiking recurrent network of excitatory and inhibitory biophysical neurons drives a read-out layer: the dynamics of the recurrent network is constrained to encode time  ...  To embed a feedforward structure, the network is stimulated in a sequential manner (Fig 2A) . Neurons in cluster i each receive external Poisson spike trains (rate of 18 kHz for 10 ms).  ... 
doi:10.1101/693861 fatcat:5ekvqxznvrayvnxzz276rle2ga
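
This preprint's snippet gives concrete stimulation numbers: each cluster receives external Poisson spike trains at 18 kHz for 10 ms, delivered cluster by cluster to embed a feedforward structure. A minimal sketch of that sequential stimulation follows; the cluster count and time step are assumptions, only the rate and window come from the snippet.

```python
import numpy as np

rng = np.random.default_rng(2)
n_clusters, dt = 10, 1e-5            # cluster count and time step are assumptions
rate_hz, window_s = 18_000.0, 0.01   # 18 kHz for 10 ms, as in the snippet

def poisson_spike_train(rate, duration, dt):
    # One external Poisson input: spike in each bin with probability rate * dt.
    n_bins = int(round(duration / dt))
    return rng.random(n_bins) < rate * dt

# Stimulate clusters one after another to embed a feedforward structure.
for i in range(n_clusters):
    spikes = poisson_spike_train(rate_hz, window_s, dt)
    print(f"cluster {i}: {spikes.sum()} input spikes in its 10 ms window")
```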

Somatostatin-Expressing Interneurons Enable and Maintain Learning-Dependent Sequential Activation of Pyramidal Neurons

Avital Adler, Ruohe Zhao, Myung Eun Shin, Ryohei Yasuda, Wen-Biao Gan
2019 Neuron  
We found that layer 2 and/or 3 pyramidal neurons (PNs) showed sequential activation in the mouse primary motor cortex during motor skill learning.  ...  Conversely, inactivating SST interneurons during the learning of a new motor task reversed sequential activities and behavioral improvement that occurred during a previous task.  ... 
doi:10.1016/j.neuron.2019.01.036 pmid:30792151 pmcid:PMC6555419 fatcat:4lu7e3gh7radhg66v7qgvajzwm

A neuromorphic categorization system with Online Sequential Extreme Learning

Ruoxi Ding, Bo Zhao, Shoushun Chen
2014 2014 IEEE Biomedical Circuits and Systems Conference (BioCAS) Proceedings  
The extracted spike patterns are then classified by an Online Sequential Extreme Learning Machine with an Auto Encoder.  ...  Experimental results show that the proposed system has a very fast training speed while still maintaining competitive accuracy.  ...  The recursive learning approach presented in (10) and (11) consists of two phases: (1) an initialization phase; (2) sequential learning.  ... 
doi:10.1109/biocas.2014.6981780 dblp:conf/biocas/DingZC14 fatcat:247gx2gckfc6bp43cqey2xv3fu
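
The Auto Encoder half of this pipeline is, in the usual ELM-AE formulation, a random hidden layer whose output weights are solved so that the network reconstructs its own input; the learned weights then act as a feature transform. A minimal sketch under those assumptions (hidden size, tanh activation, and regularization are illustrative, not from the entry):

```python
import numpy as np

rng = np.random.default_rng(3)

def elm_autoencoder_features(X, n_hidden=64, reg=1e-3):
    # ELM-AE: random hidden layer, then ridge-solve output weights beta so
    # the network reconstructs its own input; beta.T acts as a feature map.
    Win = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ Win + b)
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ X)
    return X @ beta.T                  # encoded representation

X = rng.normal(size=(200, 32))         # stand-in for extracted spike patterns
Z = elm_autoencoder_features(X)
print(Z.shape)                         # (200, 64)
```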

The Roles of the Cortical Motor Areas in Sequential Movements

Machiko Ohbayashi
2021 Frontiers in Behavioral Neuroscience  
Studies in human and non-human primates have shown that a brain-wide distributed network is active during the learning and performance of skilled sequential movements.  ...  For instance, sequential movements are initially learned relatively fast and later learned more slowly.  ...  Figure 2(A): apparatus of the training setup; a monkey sits in front of a touch-sensitive monitor.  ... 
doi:10.3389/fnbeh.2021.640659 pmid:34177476 pmcid:PMC8219877 fatcat:fwvdehxhf5eg7grtdjf45ytccu

Spatial vs temporal continuity in view invariant visual object recognition learning

Gavin Perry, Edmund T. Rolls, Simon M. Stringer
2006 Vision Research  
We show in a 4-layer competitive neuronal network that continuous transformation learning, which uses spatial correlations and a purely associative (Hebbian) synaptic modification rule, can build view  ...  This occurs even when views of the different objects are interleaved, a condition where temporal trace learning fails.  ...  new permuted training condition in which the views of a given object were presented in permuted rather than sequential order.  ... 
doi:10.1016/j.visres.2006.07.025 pmid:16996556 fatcat:rmjtnqv2crconepxnprbqnzaza
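
The contrast this entry draws is between a purely associative (Hebbian) rule, which exploits spatial overlap between successive views, and a temporal trace rule, which associates temporally adjacent views through a decaying activity trace. The two updates differ only in the postsynaptic term, as the minimal sketch below shows; the learning rate and trace constant eta are illustrative values, not the paper's.

```python
import numpy as np

def hebbian_update(w, x, y, alpha=0.01):
    # Purely associative rule: the weight change follows instantaneous
    # pre-synaptic (x) and post-synaptic (y) activity.
    return w + alpha * np.outer(y, x)

def trace_update(w, x, y, y_trace, alpha=0.01, eta=0.8):
    # Trace rule: the postsynaptic term is a decaying average of recent
    # activity, which binds temporally adjacent views of the same object.
    y_trace = (1 - eta) * y + eta * y_trace
    return w + alpha * np.outer(y_trace, x), y_trace
```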

Neural networks' learning process acceleration

L. Katerynych, M. Veres, E. Safarov (Kiev Taras Shevchenko National University)
2020 PROBLEMS IN PROGRAMMING  
This study is devoted to evaluating the training process of a parallel system in the form of an artificial neural network, which is built using a genetic algorithm.  ...  The performance of sequential and parallel training processes of an artificial neural network is compared.  ...  The performance of a sequential version of the ANN, after completing one iteration of GA training, over the first 100 iterations of learning is shown in Fig. 8.  ... 
doi:10.15407/pp2020.02-03.313 fatcat:bq7q7ccfbjbgna3hmjlvbf7ybq
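
Training a network with a genetic algorithm means treating the flattened weight vector as a genome and evolving a population of such genomes against a fitness function. A minimal sketch follows; the network shape, population size, mutation rate, and toy task are all illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
X = rng.normal(size=(64, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)    # toy XOR-like task

def forward(w, X):
    # Tiny 2-4-1 network; w is a flat genome of 17 weights.
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    h = np.tanh(X @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))

def fitness(w):
    return -np.mean((forward(w, X) - y) ** 2)  # negative MSE

pop = rng.normal(size=(50, 17))
for gen in range(200):
    scores = np.array([fitness(w) for w in pop])
    parents = pop[np.argsort(scores)[-10:]]            # select the fittest
    children = parents[rng.integers(0, 10, size=50)]   # clone ...
    pop = children + 0.1 * rng.normal(size=children.shape)  # ... and mutate

print("best MSE:", -max(fitness(w) for w in pop))
```

Note that the fitness evaluations inside each generation are independent of one another, which is exactly what makes the parallel version of this training process attractive compared with the sequential one the paper benchmarks.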

A Unified Approach to Sequential Constructive Methods [chapter]

Marco Muselli
1999 Perspectives in Neural Computing  
A general treatment of a particular class of learning techniques for neural networks, called sequential constructive methods, is proposed.  ...  They subsequently add units to the hidden layer until all the input-output relations contained in a given training set are satisfied.  ...  the samples contained in a given training set S.  ... 
doi:10.1007/978-1-4471-0811-5_41 fatcat:zdsg4fwnx5fqzdoq5m7ffw2kre
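
The defining loop of this class of methods is: train one hidden unit, remove the training samples it satisfies, and repeat until the whole training set is covered. The sketch below is a heavy simplification under stated assumptions (binary labels in {-1, +1}, each unit a plain perceptron, a fixed unit budget); it illustrates the constructive loop, not Muselli's unified treatment.

```python
import numpy as np

def train_perceptron(X, y, epochs=50, lr=0.1):
    # Plain perceptron (with bias) on the residual sample set.
    w = np.zeros(X.shape[1] + 1)
    Xb = np.hstack([X, np.ones((len(X), 1))])
    for _ in range(epochs):
        for xi, yi in zip(Xb, y):
            if yi * (w @ xi) <= 0:
                w += lr * yi * xi
    return w

def sequential_constructive(X, y, max_units=20):
    # Add hidden units until every sample in the training set S is
    # satisfied or the unit budget runs out; each new unit handles the
    # samples the previous units still get wrong.
    units, remaining = [], np.arange(len(X))
    while len(remaining) and len(units) < max_units:
        w = train_perceptron(X[remaining], y[remaining])
        Xb = np.hstack([X[remaining], np.ones((len(remaining), 1))])
        correct = np.sign(Xb @ w) == y[remaining]
        if not correct.any():
            break                      # this unit satisfied nothing; give up
        units.append(w)
        remaining = remaining[~correct]
    return units
```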
Showing results 1 — 15 out of 60,443 results