
SPICEprop: Backpropagating Errors Through Memristive Spiking Neural Networks [article]

Peng Zhou, Jason K. Eshraghian, Dong-Uk Choi, Sung-Mo Kang
2022 arXiv   pre-print
The natural spiking dynamics of the MIF neuron model are fully differentiable, eliminating the need for gradient approximations that are prevalent in the spiking neural network literature.  ...  We present a fully memristive spiking neural network (MSNN) consisting of novel memristive neurons trained using the backpropagation through time (BPTT) learning rule.  ...  By unrolling the computational graph of spiking neuron models, SNNs can take advantage of the backpropagation through time (BPTT) algorithm [5] such that a global error is calculated in the final layer  ... 
arXiv:2203.01426v3 fatcat:bvii3xz73fdopbmoddxwa2d2qu
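The BPTT-over-unrolled-spiking-neurons idea in the snippet above can be illustrated with a toy example. The sketch below trains a plain leaky integrate-and-fire layer with a surrogate spike gradient, which is exactly the kind of gradient approximation the paper's fully differentiable MIF neuron is meant to avoid; all sizes, constants, and the rate-matching loss are illustrative assumptions, not the paper's setup.

```python
import torch

# Toy sketch of backpropagation through time (BPTT) over an unrolled spiking layer.
# Uses a plain leaky integrate-and-fire (LIF) neuron with a surrogate spike gradient;
# the paper's differentiable MIF neuron would not need this approximation.
# All sizes, constants and the rate-matching loss are illustrative assumptions.
torch.manual_seed(0)
T, n_in, n_hid = 50, 10, 32          # time steps, input size, hidden size
beta, v_th = 0.9, 1.0                # membrane decay and spike threshold
W = torch.nn.Parameter(0.1 * torch.randn(n_in, n_hid))

def spike(v):
    s_hard = (v > v_th).float()                   # non-differentiable spike
    s_soft = torch.sigmoid(5.0 * (v - v_th))      # smooth stand-in for the gradient
    return s_hard.detach() + s_soft - s_soft.detach()

x = (torch.rand(T, n_in) < 0.2).float()           # random input spike trains
v = torch.zeros(n_hid)
spikes = []
for t in range(T):                                # unroll the computational graph
    v = beta * v + x[t] @ W
    s = spike(v)
    v = v - s * v_th                              # soft reset after each spike
    spikes.append(s)

loss = (torch.stack(spikes).mean() - 0.1) ** 2    # push firing rate toward 10%
loss.backward()                                   # BPTT: errors flow through all T steps
print(W.grad.abs().mean())
```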

Learning neural connectivity from firing activity: efficient algorithms with provable guarantees on topology

Amin Karbasi, Amir Hesam Salavati, Martin Vetterli
2018 Journal of Computational Neuroscience  
In this paper, we formulate the neural network reconstruction as an instance of a graph learning problem, where we observe the behavior of nodes/neurons (i.e., firing activities) and aim to find the links  ...  We then validate the performance of the algorithm using artificially generated data (for benchmarking) and real data recorded from multiple hippocampal areas in rats.  ...  György Buzsáki for generously sharing their recorded data from hippocampal areas in rats (Mizuseki et al. 2013; Mizuseki et al. 2009 ).  ... 
doi:10.1007/s10827-018-0678-8 pmid:29464489 pmcid:PMC5851696 fatcat:tyf6zrzzwrcijbddzpjxlq6uie
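As a rough illustration of the reconstruction setting described above, the sketch below infers a candidate connectivity graph from binned firing activity by thresholding lagged correlations. This is only a naive baseline, not the paper's algorithm with its topology guarantees; the neuron/bin counts, lag, and threshold are made-up parameters.

```python
import numpy as np

# Naive baseline for connectivity inference (not the paper's algorithm): bin the
# firing activity, compute lagged correlations between all neuron pairs, and
# threshold them into a candidate adjacency matrix.  All parameters are made up.
rng = np.random.default_rng(0)
n_neurons, n_bins, lag = 20, 5000, 1
spikes = (rng.random((n_neurons, n_bins)) < 0.05).astype(float)   # stand-in spike counts

pre = spikes[:, :-lag] - spikes[:, :-lag].mean(axis=1, keepdims=True)
post = spikes[:, lag:] - spikes[:, lag:].mean(axis=1, keepdims=True)
corr = (pre @ post.T) / np.outer(np.linalg.norm(pre, axis=1), np.linalg.norm(post, axis=1))
np.fill_diagonal(corr, 0.0)                      # ignore self-connections

adjacency = (np.abs(corr) > 0.05).astype(int)    # crude threshold on the score
print(adjacency.sum(), "candidate directed edges")
```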

An Analytic Solution to the Inverse Ising Problem in the Tree-reweighted Approximation [article]

Takashi Sano
2018 arXiv   pre-print
In an experiment to reconstruct the interaction matrix, we found that the proposed formula returns the best estimates in strongly-attractive regions for various graph structures.  ...  When applied to finding the connectivity of neurons from spike train data, the proposed formula gave the closest result to that obtained by a gradient ascent algorithm, which typically requires thousands  ...  ACKNOWLEDGMENT The author is grateful to Yuuji Ichisugi and anonymous reviewers for comments.  ... 
arXiv:1805.11452v1 fatcat:n6j4a5tulzb2zo7tpbinnrzuvu
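For orientation, the snippet below shows the textbook naive mean-field estimate for the inverse Ising problem, J ≈ -(C^{-1}) on the off-diagonal, applied to synthetic data. It is not the tree-reweighted formula derived in the paper; it is only the kind of closed-form baseline such a formula would be compared against.

```python
import numpy as np

# Naive mean-field inverse Ising estimate, J ≈ -(C^{-1}) off the diagonal.
# Shown only as a baseline for comparison; this is NOT the paper's tree-reweighted
# formula.  The spin data are independent synthetic samples, so true couplings are zero.
rng = np.random.default_rng(1)
n_spins, n_samples = 15, 20000
samples = rng.choice([-1.0, 1.0], size=(n_samples, n_spins))   # placeholder spin data

m = samples.mean(axis=0)                          # magnetizations <s_i>
C = np.cov(samples, rowvar=False)                 # connected correlation matrix
J_nmf = -np.linalg.inv(C)                         # naive mean-field couplings
np.fill_diagonal(J_nmf, 0.0)
h_nmf = np.arctanh(np.clip(m, -0.999, 0.999)) - J_nmf @ m      # matching external fields
print(np.abs(J_nmf).max())                        # should be small for independent data
```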

Genesis of Organic Computing Systems: Coupling Evolution and Learning [chapter]

Christian Igel, Bernhard Sendhoff
2009 Understanding Complex Systems  
In organic computing, simulated evolutionary structure optimization can create artificial neural networks for particular environments.  ...  In this chapter, trends and recent results in combining evolutionary and neural computation are reviewed.  ...  Acknowledgments Christian Igel acknowledges support from the German Federal Ministry of Education and Research within the Bernstein group "The grounding of higher brain function in dynamic neural fields  ... 
doi:10.1007/978-3-540-77657-4_7 fatcat:v54a6do2zzhvhm2677wgpz6vhe

Computing with Spiking Neuron Networks [chapter]

Hélène Paugam-Moisy, Sander Bohte
2012 Handbook of Natural Computing  
Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks.  ...  The computational power of SNNs is addressed in Section 3 and the problem of learning in networks of spiking neurons is tackled in Section 4, with insights into the tracks currently explored for solving  ...  provable learning results for spiking neurons.  ... 
doi:10.1007/978-3-540-92910-9_10 fatcat:uixmvc27zjgirg5oobqaqwccpe

STDP Installs in Winner-Take-All Circuits an Online Approximation to Hidden Markov Model Learning

David Kappel, Bernhard Nessler, Wolfgang Maass, Henning Sprekeler
2014 PLoS Computational Biology  
In fact, one can show that these motifs endow cortical microcircuits with functional properties of a hidden Markov model, a generic model for solving such tasks through probabilistic inference.  ...  This capability emerges in the presence of noise automatically through effects of STDP on connections between pyramidal cells in Winner-Take-All circuits with lateral excitation.  ...  For the HMM, this approach results in an instantiation of the EM algorithm [19, 30] in a network of spiking neurons (stochastic WTA circuit).  ... 
doi:10.1371/journal.pcbi.1003511 pmid:24675787 pmcid:PMC3967926 fatcat:fy7nvgo235hytdkkflp72fnlva
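The online approximation to EM described above can be caricatured in a few lines: a soft winner-take-all (WTA) circuit stochastically picks a winner (an implicit E-step), and an STDP-like rule moves that winner's weights toward the current input pattern (an implicit M-step). The learning rate, sizes, and the plain Hebbian form of the update below are illustrative assumptions, not the paper's exact plasticity rule.

```python
import numpy as np

# Toy sketch of EM-like online learning in a winner-take-all (WTA) circuit:
# soft WTA selects a stochastic winner (E-step), and an STDP-like rule moves the
# winner's weights toward the input (M-step).  All constants are assumptions.
rng = np.random.default_rng(2)
n_in, n_wta, eta = 40, 5, 0.05
W = rng.normal(0.0, 0.1, size=(n_wta, n_in))

for step in range(2000):
    x = (rng.random(n_in) < 0.1).astype(float)     # binary input "spike" pattern
    u = W @ x                                      # membrane potentials
    p = np.exp(u - u.max()); p /= p.sum()          # soft WTA via lateral inhibition
    k = rng.choice(n_wta, p=p)                     # stochastic winner
    W[k] += eta * (x - W[k])                       # winner's weights track the input
```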

Neuromorphic implementations of neurobiological learning algorithms for spiking neural networks

Florian Walter, Florian Röhrbein, Alois Knoll
2015 Neural Networks  
The paper thus gives an in-depth overview of neuromorphic implementations of basic mechanisms of synaptic plasticity which are required to realize advanced cognitive capabilities with spiking neural networks  ...  Enabling higher cognitive functions for neurorobotics consequently requires the application of neurobiological learning algorithms to adjust synaptic weights in a biologically plausible way.  ...  Acknowledgments We thank the anonymous reviewers for their valuable feedback.  ... 
doi:10.1016/j.neunet.2015.07.004 pmid:26422422 fatcat:qbpqali565blfphyprdvy6o3lm

Instantaneous Cross-Correlation Analysis of Neural Ensembles with High Temporal Resolution [chapter]

António R.C. Paiva, Il Park, José C. Príncipe, Justin C. Sanchez
2013 Introduction to Neural Engineering for Motor Rehabilitation  
Over the last decade several positive definite kernels have been proposed to treat spike trains as objects in Hilbert space.  ...  However, for the most part, such attempts still remain a mere curiosity for both computational neuroscientists and signal processing experts.  ...  The KLMS has been applied with advantages in nonlinear signal processing of continuous amplitude signals, mostly using the Gaussian kernel, but one of the advantages of the RKHS approach is that the algorithm  ... 
doi:10.1002/9781118628522.ch10 fatcat:qwnlvtl45rgsnpuag6c6tanksy

Kernel Methods on Spike Train Space for Neuroscience: A Tutorial

Il Memming Park, Sohan Seth, Antonio R.C. Paiva, Lin Li, Jose C. Principe
2013 IEEE Signal Processing Magazine  
Over the last decade several positive definite kernels have been proposed to treat spike trains as objects in Hilbert space.  ...  However, for the most part, such attempts still remain a mere curiosity for both computational neuroscientists and signal processing experts.  ...  The KLMS has been applied with advantages in nonlinear signal processing of continuous amplitude signals, mostly using the Gaussian kernel, but one of the advantages of the RKHS approach is that the algorithm  ... 
doi:10.1109/msp.2013.2251072 fatcat:lxwgwxt7nzhazg5hroo254bdve
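A concrete example of the "positive definite kernels on spike trains" mentioned in the two entries above is a cross-intensity style kernel that sums a smoothing function over all pairs of spike times, K(s, s') = Σ_i Σ_j exp(-|t_i - t'_j| / τ). The Laplacian width τ and the spike times in the sketch below are made up.

```python
import numpy as np

# Cross-intensity style spike-train kernel with Laplacian smoothing:
# K(s, s') = sum_i sum_j exp(-|t_i - t'_j| / tau).  Spike times and tau are made up.
def cross_intensity_kernel(times_a, times_b, tau=0.05):
    ta = np.asarray(times_a)[:, None]
    tb = np.asarray(times_b)[None, :]
    return float(np.exp(-np.abs(ta - tb) / tau).sum())

s1 = [0.01, 0.12, 0.30, 0.31]   # spike times of train 1 (seconds)
s2 = [0.02, 0.29, 0.55]         # spike times of train 2
print(cross_intensity_kernel(s1, s2))
```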

Solving Constraint Satisfaction Problems with Networks of Spiking Neurons

Zeno Jonke, Stefan Habenschuss, Wolfgang Maass
2016 Frontiers in Neuroscience  
Furthermore, networks of spiking neurons carry out for the Traveling Salesman Problem a more efficient stochastic search for good solutions compared with stochastic artificial neural networks (Boltzmann  ...  We present here a new method for designing networks of spiking neurons via an energy function.  ...  ACKNOWLEDGMENTS We would like to thank Dejan Pecevski for giving us advance access to the event-based neural simulator NEVESIM (Pecevski et al., 2014) , that we used for all our simulations.  ... 
doi:10.3389/fnins.2016.00118 pmid:27065785 pmcid:PMC4811945 fatcat:ecdskaa5f5h2hbjhiu3wr5rqrq
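The energy-function design idea can be illustrated abstractly: encode the constraints as an energy over binary variables and let a noisy update rule search for low-energy states. The sketch below uses a plain Metropolis-style search on a toy ring two-coloring problem, not the paper's actual mapping onto spiking neurons; the problem size, temperature schedule, and penalty matrix are assumptions.

```python
import numpy as np

# Abstract illustration of stochastic search on a constraint energy (not the paper's
# spiking-network construction): a toy ring two-coloring problem with a Metropolis
# acceptance rule and slow annealing.  Size and schedule are assumptions.
rng = np.random.default_rng(3)
n = 8
J = np.zeros((n, n))
for i in range(n):                                  # penalize agreeing ring neighbours
    J[i, (i + 1) % n] = J[(i + 1) % n, i] = 1.0

def energy(x):
    return float(x @ J @ x + (1.0 - x) @ J @ (1.0 - x))

x = rng.integers(0, 2, size=n).astype(float)
temperature = 1.0
for step in range(500):
    i = rng.integers(n)
    flipped = x.copy(); flipped[i] = 1.0 - flipped[i]
    dE = energy(flipped) - energy(x)
    if dE < 0 or rng.random() < np.exp(-dE / temperature):   # Metropolis acceptance
        x = flipped
    temperature *= 0.995                                      # slow annealing
print(x, energy(x))
```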

Large-Scale Synthesis of Functional Spiking Neural Circuits

Terrence C. Stewart, Chris Eliasmith
2014 Proceedings of the IEEE  
KEYWORDS | Neural computation; neural engineering framework (NEF); neural modeling; neuromorphic engineering; semantic pointer architecture (SPA); Spaun; spiking neural networks  ...  We achieve this through the neural engineering framework (NEF), a mathematical theory that provides methods for systematically generating biologically plausible spiking networks to implement nonlinear  ...  THE SEMANTIC POINTER ARCHITECTURE While the NEF specifies how to convert vector-based algorithms into spiking neural networks, a separate theory is needed to describe cognitive function in terms of vector-based  ... 
doi:10.1109/jproc.2014.2306061 fatcat:bclajcxbmzgp5bqeageaek5leu
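In its simplest form, the NEF recipe mentioned above (systematically turning vector-space computations into neural populations) amounts to sampling tuning curves and solving a regularized least-squares problem for linear decoders. The sketch below does this for a scalar population with rectified-linear rate curves; the gains, biases, rate model, and regularizer are illustrative assumptions, and real NEF models such as Spaun use spiking neurons (e.g., LIF).

```python
import numpy as np

# Minimal NEF-style decoder fit: sample rate tuning curves for a population that
# represents a scalar, then solve regularized least squares for linear decoders
# of a target function.  Gains, biases, ReLU rates and the regularizer are assumptions.
rng = np.random.default_rng(4)
n_neurons, n_eval = 50, 200
x = np.linspace(-1.0, 1.0, n_eval)                  # represented scalar value
encoders = rng.choice([-1.0, 1.0], size=n_neurons)
gains = rng.uniform(0.5, 2.0, size=n_neurons)
biases = rng.uniform(-1.0, 1.0, size=n_neurons)

J = gains[:, None] * encoders[:, None] * x[None, :] + biases[:, None]
A = np.maximum(J, 0.0)                              # rate "tuning curves", shape (neurons, evals)

target = x ** 2                                     # nonlinear function to decode, f(x) = x^2
reg = 0.1 * n_eval
D = np.linalg.solve(A @ A.T + reg * np.eye(n_neurons), A @ target)
print(np.abs(A.T @ D - target).mean())              # average decoding error
```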

Stochastic Optimal Control as a Theory of Brain-Machine Interface Operation

Manuel Lagang, Lakshminarayan Srinivasan
2013 Neural Computation  
Spike binning may erode performance in part from intrinsic control-dependent constraints, regardless of decoding accuracy.  ...  Here we propose a compact model based on stochastic optimal control to describe the brain in skillfully operating canonical decoding algorithms.  ...  We thank David Rutledge at the California Institute of Technology and Sanjoy Mitter at MIT for institutional support.  ... 
doi:10.1162/neco_a_00394 pmid:23148413 fatcat:37siipxgazfbhfkubxnidtfkne

2022 Roadmap on Neuromorphic Computing and Engineering [article]

Dennis V. Christensen, Regina Dittmann, Bernabé Linares-Barranco, Abu Sebastian, Manuel Le Gallo, Andrea Redaelli, Stefan Slesazeck, Thomas Mikolajick, Sabina Spiga, Stephan Menzel, Ilia Valov, Gianluca Milano (+47 others)
2022 arXiv   pre-print
We hope that this Roadmap will be a useful resource to readers outside this field, for those who are just entering the field, and for those who are well established in the neuromorphic community. https  ...  technology, namely materials, devices, neuromorphic circuits, neuromorphic algorithms, applications, and ethics.  ...  Acknowledgements The author thanks the CNRS for support. Acknowledgements The author would like to thank E. Donati for fun and insightful discussions and brainstorming on the topic.  ... 
arXiv:2105.05956v3 fatcat:pqir5infojfpvdzdwgmwdhsdi4

A theoretical basis for efficient computations with noisy spiking neurons [article]

Zeno Jonke, Stefan Habenschuss, Wolfgang Maass
2014 arXiv   pre-print
Furthermore, one can demonstrate for the Traveling Salesman Problem a surprising computational advantage of networks of spiking neurons compared with traditional artificial neural networks and Gibbs sampling  ...  We present here a new theoretical framework for organizing computations of networks of spiking neurons.  ...  Acknowledgments We would like to thank Dejan Pecevski for giving us advance access to the event-based neural simulator NEVESIM [33] , that we used for all our simulations.  ... 
arXiv:1412.5862v1 fatcat:qfg7wp4b75hllkoye2x45zt4ei

Equilibrium Propagation: Bridging the Gap between Energy-Based Models and Backpropagation

Benjamin Scellier, Yoshua Bengio
2017 Frontiers in Computational Neuroscience  
In the case of a recurrent multi-layer supervised network, the output units are slightly nudged toward their target in the second phase, and the perturbation introduced at the output layer propagates backward  ...  Although this algorithm computes the gradient of an objective function just like Backpropagation, it does not need a special computation or circuit for the second phase, where errors are implicitly propagated  ...  NSERC, CIFAR, Samsung and Canada Research Chairs for funding, and Compute Canada for computing resources.  ... 
doi:10.3389/fncom.2017.00024 pmid:28522969 pmcid:PMC5415673 fatcat:qgsbd2juzrfy5gchn2dnfhwwuu
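The two-phase scheme described above can be condensed into a few lines: relax the network state to a free fixed point, nudge the outputs toward their targets with strength beta, relax again, and update the weights from the difference of local activity products. The sketch below is a simplified version with a hard-sigmoid activation; the network size, clamped-input convention, and all constants are assumptions rather than the paper's exact setup.

```python
import numpy as np

# Condensed sketch of Equilibrium Propagation on a small symmetrically connected
# network with hard-sigmoid activation rho: a free phase, a weakly clamped (nudged)
# phase, and a contrastive local weight update.  Sizes and constants are assumptions.
rng = np.random.default_rng(5)
n, n_out, beta, eps, eta = 12, 2, 0.5, 0.1, 0.01
W = rng.normal(0, 0.1, size=(n, n)); W = (W + W.T) / 2.0; np.fill_diagonal(W, 0.0)
x_clamp = rng.random(3)                        # first 3 units are clamped inputs
y_target = rng.random(n_out)                   # targets for the last n_out units

rho = lambda u: np.clip(u, 0.0, 1.0)
rho_prime = lambda u: ((u > 0.0) & (u < 1.0)).astype(float)

def relax(s, beta, steps=200):
    s = s.copy()
    for _ in range(steps):
        drive = rho_prime(s) * (W @ rho(s)) - s            # descend the internal energy
        drive[-n_out:] += beta * (y_target - s[-n_out:])   # nudge outputs toward targets
        s += eps * drive
        s[:3] = x_clamp                                    # keep inputs clamped
    return s

s_free = relax(np.zeros(n), beta=0.0)          # first phase: free fixed point
s_nudge = relax(s_free, beta=beta)             # second phase: weakly clamped fixed point
dW = (np.outer(rho(s_nudge), rho(s_nudge)) - np.outer(rho(s_free), rho(s_free))) / beta
W += eta * dW                                  # symmetric, purely local weight update
```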