
Neural coordination can be enhanced by occasional interruption of normal firing patterns: A self-optimizing spiking neural network model [article]

Alexander Woodward, Tom Froese, Takashi Ikegami
2014 arXiv   pre-print
Here we demonstrate that it can be transferred to more biologically plausible neural networks by implementing a self-optimizing spiking neural network model.  ...  In addition, by using this spiking neural network to emulate a Hopfield network with Hebbian learning, we attempt to make a connection between rate-based and temporal coding based neural systems.  ...  Through the self-optimization process our spiking neural network tends to go to an optimal attractor, which may correspond to a functionally efficacious CA in the sense of Varela.  ... 
arXiv:1409.0470v1 fatcat:vk4zucyl2vbx7bs7p5d6vitqie
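The abstract above mentions emulating a Hopfield network with Hebbian learning as the rate-based reference point for the spiking model. As a point of comparison only (this is not the authors' spiking implementation), a minimal Hopfield recall loop in plain NumPy looks like the following; the pattern count, network size, and noise level are arbitrary illustrative choices.

```python
import numpy as np

def hebbian_weights(patterns):
    """Hebbian learning rule: W = (1/N) * sum of outer products, zero diagonal."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)
    return W

def hopfield_recall(W, state, steps=20):
    """Asynchronous updates until the state settles into an attractor."""
    state = state.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(state)):
            state[i] = 1 if W[i] @ state >= 0 else -1
    return state

# Toy example: store two bipolar patterns and recall one from a corrupted cue.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(2, 64))
W = hebbian_weights(patterns)
cue = patterns[0].copy()
cue[:10] *= -1                                            # flip 10 of 64 units
print(np.mean(hopfield_recall(W, cue) == patterns[0]))    # ideally close to 1.0
```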

Spiking neural networks trained via proxy [article]

Saeed Reza Kheradpisheh, Maryam Mirsadeghi, Timothée Masquelier
2021 arXiv   pre-print
We propose a new learning algorithm to train spiking neural networks (SNN) using conventional artificial neural networks (ANN) as proxy.  ...  The forward passes of the two networks are totally independent.  ...  works: Bringing the power of gradient-based optimization to spiking neural networks, IEEE [19] S. R. Kheradpisheh and T.  ... 
arXiv:2109.13208v2 fatcat:wlfdusom7nc5xomemafs244xum
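The snippet only gives the high-level idea of proxy training: the ANN and the SNN share weights, the SNN's forward pass is entirely independent, and the update is computed through the ANN. The single-layer NumPy sketch below is a hypothetical, much-reduced illustration of that weight-sharing scheme, not the authors' algorithm; the layer sizes, threshold, and learning rate are made up.

```python
import numpy as np

rng = np.random.default_rng(1)
W = rng.normal(0, 0.1, size=(784, 10))     # weights shared by both networks

def ann_forward(x, W):
    return np.maximum(x @ W, 0.0)            # ReLU proxy activation

def snn_forward(x, W, T=50, thr=1.0):
    """Integrate-and-fire forward pass; returns firing rates over T steps."""
    v = np.zeros(W.shape[1])
    spikes = np.zeros(W.shape[1])
    for _ in range(T):
        v += x @ W                            # constant input current each step
        fired = v >= thr
        spikes += fired
        v[fired] -= thr                       # soft reset
    return spikes / T

# One illustrative "proxy" update: evaluate the SNN, differentiate through the ANN.
x = rng.random(784)
target = np.eye(10)[3]
rate = snn_forward(x, W)                      # SNN output (not differentiated)
proxy = ann_forward(x, W)                     # ANN output used for the gradient path
err = rate - target                           # error measured on the SNN output
grad = np.outer(x, err * (proxy > 0))         # backprop through the ReLU proxy
W -= 0.1 * grad                               # shared weights updated for both networks
```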

RRAM based neuromorphic algorithms [article]

Roshan Gopalakrishnan
2019 arXiv   pre-print
This report mainly discusses work on deep neural network to spiking neural network conversion and its significance.  ...  This report gives an overview of the algorithms implemented on neuromorphic hardware with crossbar arrays of RRAM synapses.  ...  Conversion of DNN to the spike-based domain: Spiking Deep Neural Network (SDNN) In a conventional CPU or GPU, it requires more time and energy to run an SDNN, whereas the power consumption and computational  ... 
arXiv:1903.02519v1 fatcat:kjb5c4e5yfhkjb7cft4nwgfqfi
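The premise behind DNN-to-SDNN conversion is that the firing rate of an integrate-and-fire neuron driven by a constant input approximates the ReLU activation of the original unit. The short NumPy check below illustrates that correspondence; it is a generic demonstration, not the report's method, and the simulation length and threshold are arbitrary.

```python
import numpy as np

def if_rate(current, T=1000, thr=1.0):
    """Firing rate of an integrate-and-fire neuron driven by a constant input current."""
    v, spikes = 0.0, 0
    for _ in range(T):
        v += current
        if v >= thr:
            spikes += 1
            v -= thr          # soft reset keeps the residual charge
    return spikes / T

for a in (-0.2, 0.0, 0.3, 0.7):
    # The rate tracks max(a, 0): negative inputs never cross the threshold.
    print(a, np.maximum(a, 0.0), round(if_rate(a), 3))
```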

Deep learning in spiking neural networks

Amirhossein Tavanaei, Masoud Ghodrati, Saeed Reza Kheradpisheh, Timothée Masquelier, Anthony Maida
2019 Neural Networks  
In this approach, a deep (multilayer) artificial neural network (ANN) is trained in a supervised manner using backpropagation.  ...  Spiking neural networks (SNNs) are thus more biologically realistic than ANNs, and arguably the only viable option if one wants to understand how the brain computes.  ...  In comparison to true biological networks, the network dynamics of artificial SNNs are highly simplified.  ... 
doi:10.1016/j.neunet.2018.12.002 fatcat:nfat4xwh5bdtfhauugyqpxhnzq

Training deep neural networks for binary communication with the Whetstone method

William Severa, Craig M. Vineyard, Ryan Dellana, Stephen J. Verzi, James B. Aimone
2019 Nature Machine Intelligence  
To date, the majority of artificial neural networks have not operated using discrete spike-like communication.  ...  We present a method for training deep spiking neural networks using an iterative modification of the backpropagation optimization algorithm.  ...  Any subjective views or opinions that might be expressed in the paper do not necessarily represent the views of the U.S. Department of Energy or the United States Government.  ... 
doi:10.1038/s42256-018-0015-y fatcat:pcoqu5ekibbcjj4q2mropjjlfe
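Whetstone is described above as an iterative modification of backpropagation that drives activations toward binary, spike-like communication. The sketch below shows only the activation-sharpening idea, using a bounded ReLU that is gradually collapsed into a 0/1 step; the exact function and schedule are illustrative assumptions, not the published method or its implementation.

```python
import numpy as np

def sharpened_brelu(x, s):
    """Bounded ReLU that interpolates toward a 0/1 step as the sharpness s goes to 1.
    At s = 0 it is a linear ramp on [0, 1]; near s = 1 it is a hard threshold at 0.5."""
    lo, hi = 0.5 * s, 1.0 - 0.5 * s
    return np.clip((x - lo) / max(hi - lo, 1e-6), 0.0, 1.0)

x = np.linspace(-0.5, 1.5, 5)
for s in (0.0, 0.5, 0.99):
    print(s, np.round(sharpened_brelu(x, s), 2))
```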

Integration and co-design of memristive devices and algorithms for artificial intelligence

Wei Wang, Wenhao Song, Peng Yao, Yang Li, Joseph Van Nostrand, Qinru Qiu, Daniele Ielmini, J. Joshua Yang
2020 iScience  
However, these similarities do not directly transfer to the success of efficient computation without device and algorithm co-designs and optimizations.  ...  Such co-design and optimization have been the main focus of memristive neuromorphic engineering, which often abandons the "non-ideal" behaviors of memristive devices, although many of them resemble what  ...  With limited conductance states, the conventional artificial neural network needs to be adapted.  ... 
doi:10.1016/j.isci.2020.101809 pmid:33305176 pmcid:PMC7718163 fatcat:bibhecux2nafzjexaklossadae
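One concrete consequence of the "limited conductance states" mentioned in the snippet is that trained weights must be mapped onto a small set of programmable levels. The sketch below shows a naive uniform quantizer for that mapping; the level count and weight statistics are arbitrary, and real memristive co-design must also handle nonlinearity, variability, and drift.

```python
import numpy as np

def quantize_to_conductances(W, levels=16):
    """Map real-valued weights onto evenly spaced conductance levels,
    mimicking a memristive crossbar with a limited number of programmable states."""
    w_min, w_max = W.min(), W.max()
    step = (w_max - w_min) / (levels - 1)
    return np.round((W - w_min) / step) * step + w_min

rng = np.random.default_rng(2)
W = rng.normal(0, 0.2, size=(128, 64))
Wq = quantize_to_conductances(W, levels=16)
print(np.abs(W - Wq).max())   # worst-case programming error with 16 states
```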

Event-based Signal Processing for Radioisotope Identification [article]

Xiaoyu Huang, Edward Jones, Siru Zhang, Steve Furber, Yannis Goulermas, Edward Marsden, Ian Baistow, Srinjoy Mitra, Alister Hamilton
2020 arXiv   pre-print
This paper identifies the problem of unnecessarily high power overhead in the conventional frame-based radioisotope identification process and proposes an event-based signal processing process to address  ...  It also presents the design flow of the neuromorphic processor.  ...  Due to their spatiotemporal nature and intermediate level of abstraction between biological plausibility and the Artificial Neural Networks (ANNs) of machine learning, Spiking Neural Networks (SNNs) are  ... 
arXiv:2007.05686v2 fatcat:eoavotv4tnadxp7hqsd3vadbku
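The abstract contrasts a frame-based pipeline with an event-based one but does not spell out the encoder. As a generic illustration (not the paper's design), a delta-modulation encoder emits an event only when the detector signal changes by more than a threshold, which is where the reduction in data rate, and hence power, comes from.

```python
import numpy as np

def delta_encode(signal, threshold=0.1):
    """Emit an event only when the signal changes by more than a threshold,
    instead of transmitting every frame."""
    events = []
    last = signal[0]
    for t, x in enumerate(signal):
        if abs(x - last) >= threshold:
            events.append((t, np.sign(x - last)))   # (time index, polarity)
            last = x
    return events

t = np.linspace(0, 1, 200)
signal = np.exp(-((t - 0.5) ** 2) / 0.005)          # a pulse-like detector response
print(len(signal), "samples ->", len(delta_encode(signal)), "events")
```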

Conversion of Artificial Recurrent Neural Networks to Spiking Neural Networks for Low-power Neuromorphic Hardware [article]

Peter U. Diehl, Guido Zarrella, Andrew Cassidy, Bruno U. Pedroni and Emre Neftci
2016 arXiv   pre-print
of artificial neurons with those of the spiking neurons.  ...  Recurrent neural networks (RNN) are widely used in machine learning to solve a variety of sequence learning tasks.  ...  Acknowledgments We thank the organizers and the participants of the Telluride Neuromorphic Cognition Engineering Workshop 2015, and especially the natural language processing group and Rodrigo Alvarez,  ... 
arXiv:1601.04187v1 fatcat:5jwrs5dhq5eghbzme7wb7wnoby

RetinaNet Object Detector based on Analog-to-Spiking Neural Network Conversion [article]

Joaquin Royo-Miquel, Silvia Tolu, Frederik E. T. Schöller, Roberto Galeazzi
2021 arXiv   pre-print
The paper proposes a method to convert a deep learning object detector into an equivalent spiking neural network.  ...  The aim is to provide a conversion framework that is not constrained to shallow network structures and classification problems as in state-of-the-art conversion libraries.  ...  ACKNOWLEDGMENTS The research leading to this paper was conducted within the ShippingLab research program [13] sponsored by the Danish Innovation Fund, The Danish Maritime Fund, Orients Fund and the Lauritzen  ... 
arXiv:2106.05624v2 fatcat:usghcmb2ibf3hbby2tfryfdouu

State-of-the-art deep learning has a carbon emission problem. Can neuromorphic engineering help?

Evangelos Stromatias
2020 Dialogues in Clinical Neuroscience & Mental Health  
While a method to train neural networks directly on neuromorphic devices has yet to be discovered, it has already been demonstrated that executing trained neural networks on neuromorphic platforms comes  ...  Deep learning has attracted a lot of attention from both academic and industrial parties, mainly due to its success when working with large datasets and its ability to improve performance by scaling  ...  Artificial neurons, when combined with other artificial neurons, form Artificial Neural Networks (ANNs), while the various ways that artificial neurons can be combined give rise to the  ... 
doi:10.26386/obrela.v3i3.166 doaj:5adeafdcbdee4dcc8e4c8e4013f45188 fatcat:h3sjpkrupbcqhpjuxkzshqkxca

Fast-classifying, high-accuracy spiking deep networks through weight and threshold balancing

Peter U. Diehl, Daniel Neil, Jonathan Binas, Matthew Cook, Shih-Chii Liu, Michael Pfeiffer
2015 2015 International Joint Conference on Neural Networks (IJCNN)  
However, this has come at the cost of performance losses due to the conversion from analog neural networks (ANNs) without a notion of time, to sparsely firing, event-driven SNNs.  ...  We present a set of optimization techniques to minimize performance loss in the conversion process for ConvNets and fully connected deep networks.  ...  Training of spiking deep networks typically does not use spike-based learning rules, but instead starts from a conventional ANN, fully trained with backpropagation, followed by a conversion of the rate-based  ... 
doi:10.1109/ijcnn.2015.7280696 dblp:conf/ijcnn/DiehlNB0LP15 fatcat:qdomfyjbkvb6nkkytlzfb3thrq
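The abstract refers to "weight and threshold balancing" as the way to limit conversion loss. The NumPy sketch below shows the data-based flavour of that idea, rescaling a layer so that its largest activation over a calibration batch becomes 1, which keeps post-conversion firing rates in a representable range; it is a simplified single-layer illustration with made-up sizes, not the paper's full procedure.

```python
import numpy as np

def normalize_layer(W, activations, prev_scale=1.0):
    """Data-based weight normalization: rescale a layer so its maximum
    observed ReLU activation becomes 1 over a batch of calibration inputs."""
    a = np.maximum(activations @ W, 0.0)
    scale = a.max()
    return W * prev_scale / scale, scale

rng = np.random.default_rng(3)
X = rng.random((256, 100))              # a batch of training inputs as calibration data
W1 = rng.normal(0, 0.1, (100, 50))
W1n, s1 = normalize_layer(W1, X)
print(np.maximum(X @ W1n, 0.0).max())   # ~1.0 after normalization
```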

Conversion of artificial recurrent neural networks to spiking neural networks for low-power neuromorphic hardware

Peter U. Diehl, Guido Zarrella, Andrew Cassidy, Bruno U. Pedroni, Emre Neftci
2016 2016 IEEE International Conference on Rebooting Computing (ICRC)  
Acknowledgments We thank the organizers and the participants of the Telluride Neuromorphic Cognition Engineering Workshop 2015, and especially the natural language processing group and Rodrigo Alvarez,  ...  Those networks are pre-trained on a conventional computer and then converted to spiking neural networks (SNN) [6].  ...  Some of the issues that arise during the conversion of Elman recurrent networks to spiking neural networks have been addressed by the conversion of convolutional neural networks and fully-connected networks  ... 
doi:10.1109/icrc.2016.7738691 dblp:conf/icrc/DiehlZCPN16 fatcat:abq5rlfc5ff6lcjjqe3dqk4oey

Deep Learning With Spiking Neurons: Opportunities and Challenges

Michael Pfeiffer, Thomas Pfeil
2018 Frontiers in Neuroscience  
A wide range of training methods for SNNs is presented, ranging from the conversion of conventional deep networks into SNNs, constrained training before conversion, spiking variants of backpropagation,  ...  Neuromorphic hardware platforms have great potential to enable deep spiking networks in real-world applications.  ...  ACKNOWLEDGMENTS We would like to thank David Stöckel, Volker Fischer, and Andre Guntoro for critical reading and helpful discussions.  ... 
doi:10.3389/fnins.2018.00774 pmid:30410432 pmcid:PMC6209684 fatcat:flcvj3c4tvfibhn2du3y6t3jvq

Training a digital model of a deep spiking neural network using backpropagation

V Bondarev, V. Breskich, A. Zheltenkov, Y. Dreizis
2020 E3S Web of Conferences  
The classification accuracy on test data for a spiking neural network with 3 hidden layers is equal to 98.14%.  ...  Deep spiking neural networks are one of the promising event-based sensor signal processing concepts.  ...  There are three main approaches to train deep SNNs: conversion of a trained conventional deep neural network to SNN [5]; unsupervised learning based on local learning rules such as STDP [6]; direct  ... 
doi:10.1051/e3sconf/202022401026 fatcat:5tm4h55iunbxblhnupsbi7gbjq
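Of the three training routes listed in the snippet, the direct one has to backpropagate through the non-differentiable spike function; a standard workaround is a surrogate derivative such as a steep sigmoid centered on the firing threshold. The sketch below shows that generic surrogate-gradient idea and is not necessarily the function used in this paper.

```python
import numpy as np

def surrogate_spike_grad(v, thr=1.0, beta=10.0):
    """Surrogate derivative of the spike function: the derivative of a steep
    sigmoid around the threshold, used in place of the true (zero/undefined) gradient."""
    s = 1.0 / (1.0 + np.exp(-beta * (v - thr)))
    return beta * s * (1.0 - s)

v = np.linspace(0.0, 2.0, 9)
print(np.round(surrogate_spike_grad(v), 3))  # peaks near v = thr, ~0 elsewhere
```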

Layer-wise synapse optimization for implementing neural networks on general neuromorphic architectures

John Mern, Jayesh K. Gupta, Mykel J. Kochenderfer
2017 2017 IEEE Symposium Series on Computational Intelligence (SSCI)  
Deep artificial neural networks (ANNs) can represent a wide range of complex functions.  ...  Conventional ANNs must be converted into equivalent Spiking Neural Networks (SNNs) in order to be deployed on neuromorphic chips. This paper presents a way to perform this translation.  ...  ACKNOWLEDGMENT The authors would like to thank the U.S.  ... 
doi:10.1109/ssci.2017.8285202 dblp:conf/ssci/MernGK17 fatcat:qd4plzx25jfzzea7i5mowctmli