301 Hits in 4.0 sec

Artificial neural networks for neuroscientists: A primer [article]

Guangyu Robert Yang, Xiao-Jing Wang
2020 Neuron   accepted
Artificial neural networks (ANNs) are essential tools in machine learning that have drawn increasing attention in neuroscience.  ...  well as to explore optimization in neural systems, in ways that traditional models are not designed for.  ...  Figure 3: Comparing the visual system and deep convolutional neural networks.  ... 
doi:10.1016/j.neuron.2020.09.005 pmid:32970997 arXiv:2006.01001v2 fatcat:jucqkw2kufaerabonik6udtbfi

If deep learning is the answer, then what is the question? [article]

Andrew Saxe, Stephanie Nelli, Christopher Summerfield
2020 arXiv   pre-print
This perspective has the potential to radically reshape our approach to understanding neural systems, because the computations performed by deep networks are learned from experience, not endowed by the  ...  Many researchers are excited by the possibility that deep neural networks may offer theories of perception, cognition and action for biological brains.  ...  convolutional neural network.  ... 
arXiv:2004.07580v2 fatcat:2ltmlfs4xbdhvhh7qcga7rcbq4

Large-Scale Gradient-Free Deep Learning with Recursive Local Representation Alignment [article]

Alexander Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
2020 arXiv   pre-print
Training deep neural networks on large-scale datasets requires significant hardware resources whose costs (even on cloud platforms) put them out of reach of smaller organizations, groups, and individuals  ...  In this paper, we propose a gradient-free learning procedure, recursive local representation alignment, for training large-scale neural architectures.  ...  Understanding the difficulty of training deep feedforward neural networks.  ... 
arXiv:2002.03911v3 fatcat:2k326rdnnjhutfmthzqrvrsxui

Artificial Neurogenesis: An Introduction and Selective Review [chapter]

Taras Kowaliw, Nicolas Bredeche, Sylvain Chevallier, René Doursat
2014 Studies in Computational Intelligence  
Deep Learning: With the advent of deep learning, neural networks have made headlines again both in the machine learning community and publicly, to the point that "deep networks" could be seen on the cover  ...  Before deep learning, most multilayered neural nets contained only one hidden layer, with the notable exception of LeCun's convolutional network [171] (see below).  ... 
doi:10.1007/978-3-642-55337-0_1 fatcat:xx6nzfvbmfgzjhse6t5il3lbxe

Texture Image Classification Method of Porcelain Fragments Based on Convolutional Neural Network

Hongchang Wu
2021 Computational Intelligence and Neuroscience  
The texture image decomposition of porcelain fragments based on a convolutional neural network is a functional algorithm based on energy minimization.  ...  This paper conducts systematic research on image decomposition based on the variational method and compressed-sensing reconstruction with a convolutional neural network.  ...  Among the learning-based methods, the convolutional neural network is a typical deep learning model.  ... 
doi:10.1155/2021/1823930 doaj:b8c28dbc13714fb886008bca79d570c3 fatcat:q4zc6ge745e6dbzb7jdydvlapq
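As a rough illustration of the kind of model the title refers to, the sketch below defines a small convolutional classifier in PyTorch. The layer sizes, class count, and input resolution are placeholder assumptions, not the architecture described in the paper.

```python
import torch
import torch.nn as nn

class TextureCNN(nn.Module):
    """Minimal CNN for texture-patch classification (illustrative sizes only)."""
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),        # global average pooling
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TextureCNN()
logits = model(torch.randn(8, 3, 64, 64))   # batch of 8 RGB patches
print(logits.shape)                          # torch.Size([8, 5])
```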

Training spiking neural networks using reinforcement learning [article]

Sneha Aenugu
2020 arXiv   pre-print
learning processes in the brain.  ...  In this project, we propose biologically-plausible alternatives to backpropagation to facilitate the training of spiking neural networks.  ...  Deep neural networks are a viable means to learn these mappings due to the potential for rich feature representations afforded by the hierarchical structure of these networks.  ... 
arXiv:2005.05941v1 fatcat:f4d6y642x5fvnjb7iftfpwwyau

Hardware-Aware Design for Edge Intelligence

Warren J. Gross, Brett H. Meyer, Arash Ardakani
2020 IEEE Open Journal of Circuits and Systems  
INDEX TERMS Artificial intelligence, deep neural networks, hardware and systems, neural architecture search, quantization and pruning, stochastic computing, surveys and reviews.  ...  With the rapid growth of the number of devices connected to the Internet, there is a trend to move intelligent processing of the generated data with deep neural networks (DNNs) from cloud servers to the  ...  Deep belief networks and Hebbian networks are feed-forward neural networks that use the contrastive divergence learning algorithm and Hebbian learning rule to update their weights.  ... 
doi:10.1109/ojcas.2020.3047418 fatcat:d5u57awixzgl3au7fk5hh2gezu
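The snippet mentions networks that update their weights with a Hebbian learning rule. Below is a minimal NumPy sketch of one stabilized variant (Oja's rule) for a single linear unit, intended only as a generic illustration rather than the rule used in the surveyed hardware designs; data and learning rate are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

X = rng.normal(size=(100, 20))      # toy input patterns
w = rng.normal(scale=0.1, size=20)  # weights of one linear unit
lr = 0.01

for x in X:
    y = w @ x                       # postsynaptic activity
    w += lr * y * (x - y * w)       # Oja's rule: Hebbian term with decay

print(np.linalg.norm(w))            # weight norm stays bounded near 1
```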

2020 Index IEEE Transactions on Neural Networks and Learning Systems Vol. 31

2020 IEEE Transactions on Neural Networks and Learning Systems  
Liu, X., +, TNNLS Dec. 2020 5312-5323 Hebbian learning Contrastive Hebbian Feedforward Learning for Neural Networks.  ...  Xu, Y., +, TNNLS Nov. 2020 4892-4906 Contrastive Hebbian Feedforward Learning for Neural Networks.  ...  ., +, TNNLS Oct. 2020 3777-3787 On the Working Principle of the Hopfield Neural Networks and its Equivalence to the GADIA in Optimization. Uykan, Z.,  ... 
doi:10.1109/tnnls.2020.3045307 fatcat:34qoykdtarewhdscxqj5jvovqy

A Biologically Plausible Supervised Learning Method for Spiking Neural Networks Using the Symmetric STDP Rule [article]

Yunzhe Hao, Xuhui Huang, Meng Dong, Bo Xu
2019 arXiv   pre-print
Spiking neural networks (SNNs) possess energy-efficient potential due to event-based computation. However, supervised training of SNNs remains a challenge as spike activities are non-differentiable.  ...  The former methods are dependent on energy-inefficient real-valued computation and non-local transmission, as also required in artificial neural networks (ANNs), whereas the latter are either considered  ...  In our model, each hidden layer excitatory neuron has a global receptive field, which differs from that in deep convolutional neural networks (CNNs), where each convolutional layer neuron has a common  ... 
arXiv:1812.06574v2 fatcat:gj4f4jm5d5atbatrsab6pvvgua
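The paper's symmetric STDP rule is not spelled out in this snippet; for context, here is a minimal NumPy sketch of the standard pair-based STDP window, with parameter values chosen arbitrarily. A symmetric variant would make the update depend only on |dt| rather than on the sign of the timing difference.

```python
import numpy as np

A_plus, A_minus = 0.01, 0.012       # illustrative amplitudes
tau_plus, tau_minus = 20.0, 20.0    # time constants (ms)

def stdp_dw(t_pre, t_post):
    """Pair-based STDP: dt = t_post - t_pre."""
    dt = t_post - t_pre
    if dt >= 0:                                  # pre before post -> potentiation
        return A_plus * np.exp(-dt / tau_plus)
    return -A_minus * np.exp(dt / tau_minus)     # post before pre -> depression

# Update one synapse from recorded spike times (ms).
pre_spikes, post_spikes = [10.0, 40.0], [12.0, 35.0]
w = 0.5
for t_pre in pre_spikes:
    for t_post in post_spikes:
        w += stdp_dw(t_pre, t_post)
w = float(np.clip(w, 0.0, 1.0))
print(w)
```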

Designing neural networks through neuroevolution

Kenneth O. Stanley, Jeff Clune, Joel Lehman, Risto Miikkulainen
2019 Nature Machine Intelligence  
Much of recent machine learning has focused on deep learning, in which neural network weights are trained through variants of stochastic gradient descent.  ...  Neuroevolution enables important capabilities that are typically unavailable to gradient-based approaches, including learning neural network building blocks (for example activation functions), hyperparameters  ...  Current neural network research is largely focused on the fields of 'deep learning' 1,2 and 'deep reinforcement learning' 3,4.  ... 
doi:10.1038/s42256-018-0006-z fatcat:gkcu2s7bjvhnxotnhexpwpdyzu
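As a generic illustration of neuroevolution (not the specific methods surveyed in the article), the sketch below evolves the weights of a tiny fixed-topology network with truncation selection and Gaussian mutation, using no gradients. The task, population size, and mutation scale are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_params():
    # Tiny fixed-topology network: 3 inputs -> 8 tanh units -> 1 output.
    return [rng.normal(scale=0.5, size=(3, 8)), np.zeros(8),
            rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)]

def forward(params, x):
    W1, b1, W2, b2 = params
    return np.tanh(x @ W1 + b1) @ W2 + b2

def fitness(params, X, y):
    return -np.mean((forward(params, X) - y) ** 2)   # higher is better

# Toy regression task standing in for an arbitrary fitness function.
X = rng.uniform(-1.0, 1.0, size=(64, 3))
y = np.sin(X.sum(axis=1, keepdims=True))

pop = [init_params() for _ in range(20)]
for gen in range(200):
    pop.sort(key=lambda p: fitness(p, X, y), reverse=True)
    parents = pop[:5]                                 # truncation selection
    children = [[w + rng.normal(scale=0.05, size=w.shape)
                 for w in parents[i % 5]]             # Gaussian weight mutation
                for i in range(15)]
    pop = parents + children

print("best fitness:", fitness(pop[0], X, y))
```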

Editorial Biologically Learned/Inspired Methods for Sensing, Control, and Decision

Yongduan Song, Jennie Si, Sonya Coleman, Dermot Kerr
2022 IEEE Transactions on Neural Networks and Learning Systems  
In [A20], Ladosz et al. present a new bioinspired neural architecture that combines a modulated Hebbian network (MOHN) with a deep Q-network (DQN), called the modulated Hebbian plus Q-network architecture  ...  artificial neural networks and biological neural networks.  ... 
doi:10.1109/tnnls.2022.3161003 fatcat:4e6v2kclcbb5pgkqqsyyaiwzjy

ARTFLOW: A Fast, Biologically Inspired Neural Network That Learns Optic Flow Templates for Self-Motion Estimation

Oliver W. Layton
2021 Sensors  
ARTFLOW trains substantially faster and yields self-motion estimates that are far more accurate than a comparable network that relies on Hebbian learning.  ...  Here I present ARTFLOW, a biologically inspired neural network that learns patterns in optic flow to encode the observer's self-motion.  ...  This local processing architecture bears similarities to convolutional neural networks (CNNs), which excel at processing sensory data.  ... 
doi:10.3390/s21248217 pmid:34960310 pmcid:PMC8708706 fatcat:ittb3rt7wndofoaz3rskjy4riq

How much data is needed to train a medical image deep learning system to achieve necessary high accuracy? [article]

Junghwan Cho, Kyewook Lee, Ellie Shin, Garry Choy, Synho Do
2016 arXiv   pre-print
The use of Convolutional Neural Networks (CNN) in natural image classification systems has produced very impressive results.  ...  Combined with the inherent nature of medical images that make them ideal for deep-learning, further application of such systems to medical image classification holds much promise.  ...  principle and multi-scale processing as compared to other convolutional neural networks (CNN).  ... 
arXiv:1511.06348v2 fatcat:jzdjhrtk7zey3kw7hi5f3s4mua
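The question in the title can be probed with a simple learning-curve experiment: train on increasingly large subsets and track held-out accuracy. The sketch below uses scikit-learn's digits dataset and logistic regression as stand-ins for medical images and a CNN; it illustrates the protocol only, not the paper's setup.

```python
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# Accuracy as a function of training-set size.
for n in (50, 100, 200, 400, 800, len(X_train)):
    clf = LogisticRegression(max_iter=2000)
    clf.fit(X_train[:n], y_train[:n])
    print(n, round(clf.score(X_test, y_test), 3))
```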

A Comparative Study of Deep Learning Models for Network Intrusion Detection

Jyoti Khurana, Vachali Aggarwal, Harjinder Singh
2021 International Journal of Computer Applications  
This paper concerns the comparative evaluation of several deep learning techniques employed for network intrusion detection.  ...  Artificial Intelligence (AI) and Deep Learning (DL) can help address these concerns by contributing to threat detection.  ...  HAST-IDS is a hierarchical network of CNN (Convolutional Neural Network) [17] to extract spatial features and RNN (Recurrent Neural Network) [18] Pelican: A Deep Residual Network for Network Intrusion  ... 
doi:10.5120/ijca2021921135 fatcat:3naflojebrcxbkoipsg5x2aize
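The snippet describes a common pattern: a CNN extracts per-packet spatial features and an RNN models the packet sequence. The PyTorch sketch below shows that general CNN-then-LSTM arrangement with made-up dimensions; it is not the HAST-IDS or Pelican architecture.

```python
import torch
import torch.nn as nn

class CnnRnnIDS(nn.Module):
    """Illustrative CNN + LSTM flow classifier (dimensions are placeholders)."""
    def __init__(self, n_classes=2):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(1, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveMaxPool1d(8),                     # per-packet feature map
        )
        self.rnn = nn.LSTM(input_size=16 * 8, hidden_size=64, batch_first=True)
        self.head = nn.Linear(64, n_classes)

    def forward(self, flows):
        # flows: (batch, packets, bytes); each packet is a 1-D byte sequence.
        b, p, n = flows.shape
        x = self.cnn(flows.reshape(b * p, 1, n)).reshape(b, p, -1)
        _, (h, _) = self.rnn(x)                          # temporal features
        return self.head(h[-1])

model = CnnRnnIDS()
print(model(torch.rand(4, 10, 100)).shape)               # torch.Size([4, 2])
```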

Spiking Neural Networks for Inference and Learning: A Memristor-based Design Perspective [article]

M. E. Fouda, F. Kurdahi, A. Eltawil, E. Neftci
2019 arXiv   pre-print
This chapter reviews the recent development in learning with spiking neural network models and their possible implementation with memristor-based hardware.  ...  However, interdisciplinary approaches anchored in machine learning theory suggest that multifactor plasticity rules matching neural and synaptic dynamics to the device capabilities can take better advantage  ...  Gradient-based Learning in Spiking Neural Network and Three-Factor Rules Three-factor rules can be viewed as extensions of Hebbian learning and STDP, and are derived from a normative approach [Urbanczik  ... 
arXiv:1909.01771v2 fatcat:gz4ndpuzi5hytha5572bum3o4y
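The snippet notes that three-factor rules extend Hebbian learning and STDP with a modulatory signal. Below is a minimal NumPy sketch of that general idea, with arbitrary rates and spike statistics: a Hebbian coincidence term accumulates into a decaying eligibility trace, and a scalar reward (the third factor) gates the actual weight change. It is an illustration of the concept, not the chapter's derivation.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre, n_post = 10, 4
w = rng.normal(scale=0.1, size=(n_pre, n_post))
elig = np.zeros_like(w)              # per-synapse eligibility trace

lr, tau_e, dt = 0.01, 200.0, 1.0     # learning rate, trace time constant (ms), step (ms)

for step in range(1000):
    pre = (rng.random(n_pre) < 0.05).astype(float)     # presynaptic spikes
    post = (rng.random(n_post) < 0.05).astype(float)   # postsynaptic spikes
    elig += np.outer(pre, post)      # Hebbian coincidence accumulates...
    elig *= np.exp(-dt / tau_e)      # ...into a decaying trace
    reward = 1.0 if step % 100 == 0 else 0.0           # sparse third factor
    w += lr * reward * elig          # weights change only when reward arrives

print(np.abs(w).mean())
```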
Showing results 1 — 15 out of 301 results