2,038 Hits in 6.7 sec

Solving the Spike Feature Information Vanishing Problem in Spiking Deep Q Network with Potential Based Normalization [article]

Yinqian Sun, Yi Zeng, Yang Li
2022 arXiv   pre-print
In this work, we mathematically analyzed the problem of the disappearance of spiking signal features in SDQN and proposed a potential-based layer normalization (pbLN) method to directly train spiking deep  ...  deep Q Network (SDQN).  ...  In this work, we propose a potential-based layer normalization method to solve the spike activity vanishing problem in SDQN.  ... 
arXiv:2206.03654v1 fatcat:veym2jrzqbdilj7d4vehsjuidy
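The snippet names pbLN but not its formula. As a rough illustration only, a potential-based layer normalization could standardize membrane potentials across a layer before thresholding, so weak potentials deep in the network still produce spikes. The function name, the gain/bias parameters, and the epsilon below are illustrative assumptions, not the paper's exact pbLN:

```python
import numpy as np

def pb_layer_norm(potentials, gain=1.0, bias=0.0, eps=1e-5):
    """Standardize membrane potentials across a layer (illustrative pbLN-style step).

    potentials: (batch, features) membrane potentials at one timestep.
    """
    mean = potentials.mean(axis=-1, keepdims=True)
    var = potentials.var(axis=-1, keepdims=True)
    return gain * (potentials - mean) / np.sqrt(var + eps) + bias

# Thresholding the normalized potentials keeps spiking activity from
# vanishing even when raw potentials have shrunk in deep layers.
v = np.random.randn(4, 128) * 0.01   # weak potentials
spikes = (pb_layer_norm(v) > 1.0).astype(np.float32)
print(spikes.mean())                 # nonzero firing rate despite tiny v
```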

Going Deeper With Directly-Trained Larger Spiking Neural Networks [article]

Hanle Zheng, Yujie Wu, Lei Deng, Yifan Hu, Guoqi Li
2020 arXiv   pre-print
But due to the binary spike activity of the firing function and the problem of gradient vanishing or explosion, current methods are restricted to shallow architectures and thereby difficult in harnessing  ...  Spiking neural networks (SNNs) are promising in a bio-plausible coding for spatio-temporal information and event-driven signal processing, which is very suited for energy-efficient implementation in neuromorphic  ...  So, we normalize the pre-activations to N(0, V_th^2). Deep Spiking Residual Network: ResNet is one of the most popular architectures to tackle the problem of degradation when networks go deep.  ... 
arXiv:2011.05280v2 fatcat:kzerblzzf5eh3no7jmhdt2lyee
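The snippet states the normalization target directly: pre-activations rescaled to N(0, V_th^2) so a stable fraction of neurons can cross the firing threshold at every depth. A minimal numpy sketch of that target follows; the paper's actual method (a threshold-dependent batch normalization over spatio-temporal dimensions) is more involved, and the names here are assumptions:

```python
import numpy as np

V_TH = 0.5  # firing threshold

def threshold_scaled_norm(x, eps=1e-5):
    """Rescale pre-activations to roughly N(0, V_TH**2) over the batch."""
    mean = x.mean(axis=0, keepdims=True)
    var = x.var(axis=0, keepdims=True)
    return V_TH * (x - mean) / np.sqrt(var + eps)

x = np.random.randn(32, 64) * 3.0 + 1.0   # drifting pre-activations
x_norm = threshold_scaled_norm(x)
print(x_norm.std())   # ~V_TH, so threshold crossings stay well-behaved
```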

A New Spiking Convolutional Recurrent Neural Network (SCRNN) With Applications to Event-Based Hand Gesture Recognition

Yannan Xing, Gaetano Di Caterina, John Soraghan
2020 Frontiers in Neuroscience  
Rather than standard ANN to SNN conversion techniques, the network utilizes a supervised Spike Layer Error Reassignment (SLAYER) training mechanism that allows the network to adapt to neuromorphic (event-based  ...  In this paper, a novel spiking convolutional recurrent neural network (SCRNN) architecture that takes advantage of both convolution operation and recurrent connectivity to maintain the spatial and temporal  ...  The use of SLAYER effectively prevents the common gradient vanishing and explosion problem associated with recurrent neural networks.  ... 
doi:10.3389/fnins.2020.590164 pmid:33324153 pmcid:PMC7722478 fatcat:cyav4pefyvhvjck2swmoie5fye

Linear Leaky-Integrate-and-Fire Neuron Model Based Spiking Neural Networks and Its Mapping Relationship to Deep Neural Networks [article]

Sijia Lu, Feng Xu
2022 arXiv   pre-print
Spiking neural networks (SNNs) are brain-inspired machine learning algorithms with merits such as biological plausibility and unsupervised learning capability.  ...  It can serve as the theoretical basis for the potential combination of the respective merits of the two categories of neural networks.  ...  We generally adjust the spiking threshold to solve this problem. But we know that the shape of the action potential is essentially fixed, and the spiking threshold of neurons does not change.  ... 
arXiv:2207.04889v1 fatcat:7onmmma27rcm7doyrprg7tyfya
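For readers unfamiliar with the neuron model in question, a generic leaky integrate-and-fire update looks like the sketch below; this is a textbook LIF step, not the paper's specific linear LIF formulation, and the parameter names are illustrative. Note the point from the snippet: the spike shape itself is stereotyped, so v_th is the only knob commonly adjusted.

```python
import numpy as np

def lif_step(v, i_in, leak=0.9, v_th=1.0, v_reset=0.0):
    """One leaky integrate-and-fire step with a fixed threshold and hard reset."""
    v = leak * v + i_in                      # leak, then integrate input current
    spikes = (v >= v_th).astype(np.float32)  # stereotyped all-or-nothing spike
    v = np.where(spikes > 0, v_reset, v)     # hard reset after firing
    return v, spikes

v = np.zeros(8)
for t in range(10):
    v, s = lif_step(v, i_in=np.random.rand(8) * 0.4)
```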

An optimised deep spiking neural network architecture without gradients [article]

Yeshwanth Bethi, Ying Xu, Gregory Cohen, Andre van Schaik, Saeed Afshar
2022 arXiv   pre-print
The proposed Optimized Deep Event-driven Spiking neural network Architecture (ODESA) can simultaneously learn hierarchical spatio-temporal features at multiple arbitrary time scales.  ...  Through these tests, we demonstrate ODESA can optimally solve practical and highly challenging hierarchical spatio-temporal learning tasks with the minimum possible number of computing nodes.  ...  Acknowledgement This research was supported by the Commonwealth of Australia through the NGTF Cyber Call 2019 and in collaboration with the Defence Science and Technology Group of the Department of Defence  ... 
arXiv:2109.12813v3 fatcat:lrufjssgwrf6dlvteiop55nrf4

Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [article]

Jibin Wu, Chenglin Xu, Daquan Zhou, Haizhou Li, Kay Chen Tan
2020 arXiv   pre-print
By studying the equivalence between ANNs and SNNs in the discrete representation space, a primitive network conversion method is introduced that takes full advantage of spike count to approximate the activation  ...  To compensate for the approximation errors arising from the primitive network conversion, we further introduce a layer-wise learning method with an adaptive training scheduler to fine-tune the network  ...  While discretizing the feature tensors derived from the first network layer can effectively preserve the information by leveraging the redundancies in the high-dimensional feature representation [35]  ... 
arXiv:2007.01204v1 fatcat:6cgw3yvezfdjvkwxjrzcuq4gra
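The snippet's core idea, using spike counts to approximate an ANN activation, can be made concrete with a toy integrate-and-fire loop; this sketch is an assumption-laden illustration of that idea, not the paper's conversion procedure:

```python
def spike_count_activation(a, T=16, v_th=1.0):
    """Approximate a non-negative activation `a` by a spike count over T steps.

    Constant drive `a` makes the neuron fire about floor(a * T / v_th) times,
    so count * v_th / T approximates clip(a, 0, v_th): the activation is
    recovered up to a discretization error that shrinks as T grows.
    """
    v, count = 0.0, 0
    for _ in range(T):
        v += a
        if v >= v_th:
            count += 1
            v -= v_th        # soft reset keeps the residual charge
    return count * v_th / T

for a in (0.0, 0.3, 0.72, 1.5):
    print(a, spike_count_activation(a))
```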

Training Deep Spiking Auto-encoders without Bursting or Dying Neurons through Regularization [article]

Justus F. Hübotter, Pablo Lanillos, Jakub M. Tomczak
2021 arXiv   pre-print
Here, we apply end-to-end learning with membrane potential-based backpropagation to a spiking convolutional auto-encoder with multiple trainable layers of leaky integrate-and-fire neurons.  ...  However, training deep spiking neural networks, especially in an unsupervised manner, is challenging and the performance of a spiking model is significantly hindered by dead or bursting neurons.  ...  However, deep network structures are desirable to enable hierarchical information abstraction and integration.  ... 
arXiv:2109.11045v1 fatcat:tacibbyzerfohb3mmgnvznsod4
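The snippet does not give the regularizer, but a generic penalty that pushes per-neuron firing rates away from zero (dead) and away from saturation (bursting) conveys the idea; the target rate, weights, and functional form below are assumptions for illustration only:

```python
import numpy as np

def rate_regularizer(spikes, target=0.1, w_dead=1.0, w_burst=1.0):
    """Penalize firing rates below the target (dead) and above it (bursting).

    spikes: (time, neurons) binary spike raster from one layer.
    """
    rates = spikes.mean(axis=0)   # empirical firing rate per neuron
    below = (rates - target) ** 2 * (rates < target)
    above = (rates - target) ** 2 * (rates > target)
    return w_dead * below.mean() + w_burst * above.mean()

raster = (np.random.rand(100, 32) < 0.05).astype(np.float32)
print(rate_regularizer(raster))   # added to the reconstruction loss
```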

HF-SNN: High-frequency Spiking Neural Network

Jing Su, Jing Li
2021 IEEE Access  
However, an SNN with binary input and output loses much information and is slightly inferior to deep neural networks (DNNs). We consider how to make the most of the information to protect the input.  ...  INDEX TERMS: Spiking neural network, high-frequency, deep learning  ...  Due to the non-continuous function and the deep network, it becomes easy to run into the vanishing gradient problem. It is difficult and costly to train an SNN from scratch.  ... 
doi:10.1109/access.2021.3068159 fatcat:butt7yujnbhszcz647yg4pt2oq
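One way to read the information-loss point in the snippet is through rate coding: a binary spike train of length T estimates an analog input with error shrinking like 1/sqrt(T), so short (low-frequency) windows discard information. The Poisson encoder below is a generic illustration, not the HF-SNN encoding itself:

```python
import numpy as np

def poisson_encode(x, T=32, rng=None):
    """Encode analog values in [0, 1] as Poisson spike trains of length T."""
    if rng is None:
        rng = np.random.default_rng(0)
    return (rng.random((T,) + x.shape) < x).astype(np.float32)

x = np.array([0.1, 0.5, 0.9])
for T in (1, 8, 64):
    print(T, poisson_encode(x, T).mean(axis=0))  # decoded rate vs. true x
```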

Training Deep Spiking Neural Networks Using Backpropagation

Jun Haeng Lee, Tobi Delbruck, Michael Pfeiffer
2016 Frontiers in Neuroscience  
Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation.  ...  Similarly, the generation of spikes in neuron i acts on its own membrane potential via the term a_i, which is due to the reset in Equation (3) (normalized by V_th).  ... 
doi:10.3389/fnins.2016.00508 pmid:27877107 pmcid:PMC5099523 fatcat:uv2hg62gfbgqpmiy6n7i2mjkwm
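The snippet's reset term can be mirrored in a discrete-time update where each neuron's own spikes feed back negatively onto its membrane potential, scaled by V_th. This is a simplified sketch of that mechanism under stated assumptions (previous-step feedback, hard threshold), not the paper's exact Equation (3):

```python
import numpy as np

V_TH = 1.0

def membrane_update(v, syn_in, spikes_prev, leak=0.95):
    """Membrane update with reset expressed as self-feedback of -V_TH per spike."""
    v = leak * v + syn_in - V_TH * spikes_prev   # own spikes reset the potential
    spikes = (v >= V_TH).astype(np.float32)
    return v, spikes

v, s = np.zeros(6), np.zeros(6)
for t in range(20):
    v, s = membrane_update(v, syn_in=np.random.rand(6) * 0.3, spikes_prev=s)
```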

Energy-Efficient Respiratory Anomaly Detection in Premature Newborn Infants

Ankita Paul, Md. Abu Saleh Tajin, Anup Das, William M. Mongan, Kapil R. Dandekar
2022 Electronics  
To improve accuracy while reducing the energy consumption, we propose a novel spiking neural network (SNN)-based respiratory classification solution, which can be implemented on event-driven neuromorphic  ...  We propose a five-stage design pipeline involving data collection and labeling, feature scaling, deep learning model selection with hyperparameter tuning, model training and validation, and model testing  ...  Informed Consent Statement: Not applicable. Conflicts of Interest: The authors declare no conflict of interest.  ... 
doi:10.3390/electronics11050682 fatcat:jdztc4lbebbbzo6rmyhsq2i754

Heterogeneous Ensemble-Based Spike-Driven Few-Shot Online Learning

Shuangming Yang, Bernabe Linares-Barranco, Badong Chen
2022 Frontiers in Neuroscience  
Spiking neural networks (SNNs) are regarded as a promising candidate to deal with the major challenges of current machine learning techniques, including the high energy consumption induced by deep neural  ...  In this paper, we propose a novel spike-based framework with the entropy theory, namely, heterogeneous ensemble-based spike-driven few-shot online learning (HESFOL).  ...  ACKNOWLEDGMENTS All authors would like to thank the editor and reviewer for their comments on this manuscript.  ... 
doi:10.3389/fnins.2022.850932 pmid:35615277 pmcid:PMC9124799 fatcat:walcddjcgjaydmuakhleclh32m

Toward Efficient Processing and Learning With Spikes: New Approaches for Multispike Learning

Qiang Yu, Shenglan Li, Huajin Tang, Longbiao Wang, Jianwu Dang, Kay Chen Tan
2020 IEEE Transactions on Cybernetics  
Spikes are the currency in central nervous systems for information transmission and processing.  ...  A simplified spiking neuron model is first introduced with the effects of both synaptic input and firing output on the membrane potential being modeled with an impulse function.  ...  describing the influence of spikes on membrane potential. κ(t) is causal and thus vanishes for t < 0.  ... 
doi:10.1109/tcyb.2020.2984888 pmid:32356771 fatcat:m2fu25jsirerbk742ch22algwi
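A spike-response-model style sketch of the simplified neuron described in the snippet: the membrane potential is a sum of causal kernels, with synaptic inputs pushing it up and the neuron's own output spikes pulling it down by the threshold. The exponential kernel and parameter names are assumptions; the paper defines its own impulse function:

```python
import numpy as np

def kappa(t, tau=5.0):
    """Causal impulse-response kernel: vanishes for t < 0, as the snippet notes."""
    return np.where(t >= 0, np.exp(-t / tau), 0.0)

def membrane_potential(t, input_spikes, output_spikes, w=1.0, v_th=1.0, tau=5.0):
    """V(t) = sum of input kernels minus v_th-scaled kernels at output spike times."""
    v_in = w * sum(kappa(t - s, tau) for s in input_spikes)
    v_out = v_th * sum(kappa(t - s, tau) for s in output_spikes)
    return v_in - v_out

ts = np.linspace(0, 50, 501)
v = membrane_potential(ts, input_spikes=[5.0, 12.0, 13.0], output_spikes=[14.0])
```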

Supervised Learning with First-to-Spike Decoding in Multilayer Spiking Neural Networks [article]

Brian Gardner, André Grüning
2020 arXiv   pre-print
Experimental studies support the notion of spike-based neuronal information processing in the brain, with neural circuits exhibiting a wide range of temporally-based coding strategies to rapidly and efficiently  ...  Motivated by this, we propose a new supervised learning method that can train multilayer spiking neural networks to solve classification problems based on a rapid, first-to-spike decoding strategy.  ...  Acknowledgments This research has received funding from the European Union's Horizon 2020 Framework Programme for Research and Innovation under the Specific Grant Agreement No. 785907 (Human Brain Project  ... 
arXiv:2008.06937v1 fatcat:slw4uyi6xfgjhjyzzzqgxbcdae
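The first-to-spike readout described in the snippet is simple enough to show directly: the predicted class is whichever output neuron fires first. The toy raster decoder below is illustrative, not the authors' implementation:

```python
import numpy as np

def first_to_spike_class(output_spikes):
    """Return the index of the first output neuron to fire, or -1 if none fires.

    output_spikes: (time, classes) binary raster from the output layer.
    """
    times, classes = np.nonzero(output_spikes)
    if times.size == 0:
        return -1
    return int(classes[np.argmin(times)])

raster = np.zeros((20, 4))
raster[7, 2] = 1    # neuron 2 fires at t = 7
raster[11, 0] = 1   # neuron 0 fires later
print(first_to_spike_class(raster))   # -> 2
```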

Deep Spiking Delayed Feedback Reservoirs and Its Application in Spectrum Sensing of MIMO-OFDM Dynamic Spectrum Sharing

Kian Hamedani, Lingjia Liu, Shiya Liu, Haibo He, Yang Yi
2020 Proceedings of the Thirty-Fourth AAAI Conference on Artificial Intelligence (AAAI-20)
In this paper, we introduce a deep spiking delayed feedback reservoir (DFR) model to combine DFRs with spiking neurons: DFRs are a new type of recurrent neural networks (RNNs) that are able to capture the  ...  The introduced deep spiking DFR model is used to capture the temporal correlation of the spectrum occupancy time series and predict the idle/busy subcarriers in future time slots for potential spectrum  ...  This work was supported by the U.S. National Science Foundation under grants ECCS 1731672, ECCS-1802710, ECCS-1811497, CNS-1811720, and CCF-1937487.  ... 
doi:10.1609/aaai.v34i02.5484 fatcat:p2js5knwvvhl5nu4c7r6ylomfm

Artificial Neural Networks-Based Machine Learning for Wireless Networks: A Tutorial [article]

Mingzhe Chen, Ursula Challita, Walid Saad, Changchuan Yin, and Mérouane Debbah
2019 arXiv   pre-print
For this purpose, we present a comprehensive overview on a number of key types of neural networks that include feed-forward, recurrent, spiking, and deep neural networks.  ...  In this context, this paper provides a comprehensive tutorial that introduces the main concepts of machine learning, in general, and artificial neural networks (ANNs), in particular, and their potential  ...  Several types of DNNs exist such as deep convolutional networks, deep RNNs, deep belief networks, deep feedforward networks, deep SNNs, deep Q-learning, deep ESN, deep residual network, and long-short  ... 
arXiv:1710.02913v2 fatcat:kljn2evlwba4fha4lpwxjpv4yu
Showing results 1–15 of 2,038