383 Hits in 5.6 sec

Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Giorgio Gosti, Viola Folli, Marco Leonetti, Giancarlo Ruocco
2019 Entropy  
This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network.  ...  Recent research shows that, in an N-node Hopfield neural network with autapses, the number of stored patterns P is not limited to the well-known bound 0.14N, as it is for networks without autapses.  ...
doi:10.3390/e21080726 pmid:33267440 fatcat:biectlv2ffec5mzkwklejc3g6y
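The capacity claim above is easy to probe numerically. Below is a minimal sketch, not the authors' code: the function names, toy sizes, and synchronous update scheme are illustrative assumptions. It stores random ±1 patterns with a Hebbian rule and either keeps the diagonal of the weight matrix (autapses) or zeroes it (the classical case).

```python
import numpy as np

def hebbian_weights(patterns, keep_autapses=True):
    """Hebbian weights J = (1/N) sum_mu outer(xi_mu, xi_mu); zeroing
    the diagonal removes self-couplings (the classical convention)."""
    P, N = patterns.shape
    J = patterns.T @ patterns / N
    if not keep_autapses:
        np.fill_diagonal(J, 0.0)
    return J

def recall(J, state, steps=100):
    """Synchronous sign updates until a fixed point or the step limit."""
    for _ in range(steps):
        new = np.where(J @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break
        state = new
    return state

# Toy check: store P random +/-1 patterns in an N-node network.
rng = np.random.default_rng(0)
N, P = 100, 10
xi = rng.choice([-1, 1], size=(P, N))
J = hebbian_weights(xi, keep_autapses=True)
print(np.mean(recall(J, xi[0].copy()) == xi[0]))  # fraction of bits recovered
```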

Effect of dilution in asymmetric recurrent neural networks

Viola Folli, Giorgio Gosti, Marco Leonetti, Giancarlo Ruocco
2018 Neural Networks  
These attractors form the set of all possible limit behaviors of the neural network.  ...  We study, with numerical simulations, the possible limit behaviors of synchronous discrete-time deterministic recurrent neural networks composed of N binary neurons as a function of the network's level of dilution.  ...
doi:10.1016/j.neunet.2018.04.003 pmid:29705670 fatcat:ddp3ohi5krgf5c4gk3c7iiemba
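A sketch of the kind of experiment this abstract describes, under my own assumptions about details the snippet leaves open (Gaussian couplings, independent dilution of entries so the matrix is asymmetric, synchronous sign dynamics): dilute a random coupling matrix and measure the length of the limit cycle the dynamics falls into.

```python
import numpy as np

def diluted_network(N, dilution, rng):
    """Random asymmetric coupling matrix with a fraction `dilution`
    of entries zeroed independently (so J[i, j] and J[j, i] differ)."""
    J = rng.standard_normal((N, N))
    J[rng.random((N, N)) < dilution] = 0.0
    return J

def limit_behavior(J, state, max_steps=1000):
    """Iterate synchronous sign dynamics; return the length of the
    limit cycle reached (1 = fixed point), detected by recurrence."""
    seen = {}
    for t in range(max_steps):
        key = state.tobytes()
        if key in seen:
            return t - seen[key]   # cycle length
        seen[key] = t
        state = np.where(J @ state >= 0, 1, -1)
    return None  # no recurrence found within max_steps

rng = np.random.default_rng(1)
J = diluted_network(N=50, dilution=0.8, rng=rng)
print(limit_behavior(J, rng.choice([-1, 1], size=50)))
```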

Tolerance of Pattern Storage Network for Storage and Recalling of Compressed Image using SOM

M. P. Singh, Rinku Sharma Dixit
2013 International Journal of Computer Applications  
This study also explores the tolerance of the Hopfield neural network for reducing the effect of false minima in the recall process.  ...  In this paper we study the tolerance of the Hopfield neural network for storing and recalling fingerprint images. Feature extraction for these images is performed with FFT, DWT, and SOM.  ...  This learning rule exhibits the following limitation: the maximum capacity with binary input, as suggested by Hopfield, is limited to just 0.15N if small errors in recall are allowed.  ...
doi:10.5120/12234-8516 fatcat:sbslpruqz5e5fi7paumatm3uru
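The recall-tolerance question in this entry can be phrased as a small experiment. The sketch below is a generic version under my own assumptions: bit-flip noise on ±1 patterns rather than the paper's compressed fingerprint images, and the helper name is hypothetical.

```python
import numpy as np

def recall_overlap(J, pattern, flip_frac, rng, steps=50):
    """Corrupt a stored pattern by flipping a fraction of its bits,
    relax with synchronous updates, and report the overlap with the
    original: a simple probe of recall tolerance."""
    s = pattern.copy()
    s[rng.random(len(s)) < flip_frac] *= -1
    for _ in range(steps):
        new = np.where(J @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    return (s @ pattern) / len(pattern)   # 1.0 = perfect recall

rng = np.random.default_rng(4)
xi = rng.choice([-1, 1], size=(10, 200))
J = xi.T @ xi / 200
np.fill_diagonal(J, 0.0)
print(recall_overlap(J, xi[0], flip_frac=0.2, rng=rng))
```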

On the Maximum Storage Capacity of the Hopfield Model

Viola Folli, Marco Leonetti, Giancarlo Ruocco
2017 Frontiers in Computational Neuroscience  
In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the case of the Hopfield network, the most popular kind of RNN.  ...  Recurrent neural networks (RNNs) have traditionally been of great interest for their capacity to store memories.  ...
doi:10.3389/fncom.2016.00144 pmid:28119595 pmcid:PMC5222833 fatcat:iazouq4g3zbsvin7rdigdtkvdi
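A quick numerical illustration of the storage-capacity question this review addresses, under toy assumptions of mine (exact one-step stability of a stored pattern, modest N): the fraction of trials in which a stored pattern is a strict fixed point drops sharply as the load alpha = P/N approaches the classical bound.

```python
import numpy as np

def fraction_stable(N, P, trials=50, seed=2):
    """Probability that a stored pattern survives one synchronous
    update unchanged, estimated over random pattern sets."""
    rng = np.random.default_rng(seed)
    ok = 0
    for _ in range(trials):
        xi = rng.choice([-1, 1], size=(P, N))
        J = xi.T @ xi / N
        np.fill_diagonal(J, 0.0)
        ok += np.array_equal(np.where(J @ xi[0] >= 0, 1, -1), xi[0])
    return ok / trials

for alpha in (0.05, 0.10, 0.14, 0.20):
    print(alpha, fraction_stable(N=200, P=max(1, int(alpha * 200))))
```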

A Three-Threshold Learning Rule Approaches the Maximal Capacity of Recurrent Neural Networks

Alireza Alemi, Carlo Baldassi, Nicolas Brunel, Riccardo Zecchina, Peter E. Latham
2015 PLoS Computational Biology  
Here, by transforming the perceptron learning rule, we present an online learning rule for a recurrent neural network that achieves near-maximal storage capacity without an explicit supervisory error signal.  ...  A popular theoretical scenario for modeling memory function is the attractor neural network scenario, whose prototype is the Hopfield model.  ...  In particular, the Hopfield network [1] that uses a Hebbian learning rule has a storage capacity of 0.138N in the limit N → ∞ [15].  ...
doi:10.1371/journal.pcbi.1004439 pmid:26291608 pmcid:PMC4546407 fatcat:x6uucdwufvcxpnyx564pmtgaoe
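For context, the supervised baseline that the three-threshold rule approximates is the perceptron rule applied row-wise to the coupling matrix: each neuron learns to reproduce its own bit of every stored pattern with a margin. This is a sketch under my own parameter choices (margin kappa, learning rate, epoch cap), not the paper's algorithm.

```python
import numpy as np

def perceptron_store(patterns, kappa=0.5, lr=0.1, epochs=200):
    """Train each row of J as an independent perceptron so every
    stored pattern becomes a fixed point with margin kappa."""
    P, N = patterns.shape
    J = np.zeros((N, N))
    for _ in range(epochs):
        updated = False
        for x in patterns:
            wrong = x * (J @ x) < kappa        # neurons violating the margin
            if wrong.any():
                J[wrong] += lr * np.outer(x[wrong], x) / N
                updated = True
        np.fill_diagonal(J, 0.0)               # keep no self-couplings
        if not updated:
            break
    return J

rng = np.random.default_rng(7)
xi = rng.choice([-1, 1], size=(30, 100))       # load alpha = 0.3 > 0.138
J = perceptron_store(xi)
# Typically True: the perceptron rule stores loads well above Hebbian 0.138N.
print(all(np.array_equal(np.where(J @ x >= 0, 1, -1), x) for x in xi))
```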

Biological learning in key-value memory networks [article]

Danil Tyulmankov, Ching Fang, Annapurna Vadaparty, Guangyu Robert Yang
2021 arXiv   pre-print
In neuroscience, classical Hopfield networks are the standard biologically plausible model of long-term memory, relying on Hebbian plasticity for storage and attractor dynamics for recall.  ...  In contrast, memory-augmented neural networks in machine learning commonly use a key-value mechanism to store and read out memories in a single step.  ...
arXiv:2110.13976v1 fatcat:ekjopjyhabfwbdt7ua5oq267bm
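The single-step key-value readout that the abstract contrasts with attractor dynamics can be stated in a few lines. This is a generic softmax-attention memory of my own, not the paper's biologically plausible circuit; the class name and inverse temperature beta are assumptions.

```python
import numpy as np

class KeyValueMemory:
    """Minimal key-value memory: writing appends a (key, value) pair;
    reading returns a softmax-weighted mixture of values, so recall
    takes a single step rather than iterated attractor dynamics."""
    def __init__(self, beta=5.0):
        self.keys, self.values, self.beta = [], [], beta

    def write(self, key, value):
        self.keys.append(key)
        self.values.append(value)

    def read(self, query):
        K = np.stack(self.keys)              # (slots, d_key)
        V = np.stack(self.values)            # (slots, d_val)
        s = self.beta * (K @ query)          # similarity to each key
        w = np.exp(s - s.max())
        w /= w.sum()
        return w @ V

mem = KeyValueMemory()
rng = np.random.default_rng(5)
k, v = rng.standard_normal(16), rng.standard_normal(8)
mem.write(k, v)
mem.write(rng.standard_normal(16), rng.standard_normal(8))
print(np.allclose(mem.read(k), v, atol=0.5))  # typically True: near stored value
```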

High storage capacity in the Hopfield model with auto-interactions—stability analysis

Jacopo Rocchi, David Saad, Daniele Tantari
2017 Journal of Physics A: Mathematical and Theoretical  
Recent studies point to the potential storage of a large number of patterns in the celebrated Hopfield associative memory model, well beyond the limits obtained previously.  ...  We investigate the properties of the new fixed points and discover that they exhibit instabilities under small perturbations and are therefore of limited value as associative memories.  ...
doi:10.1088/1751-8121/aa8fd7 fatcat:r6vyqogkxjduvbts5y26rlisaa
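The perturbation analysis described here has a simple computational analogue: verify that a candidate state is a fixed point, then check whether single-spin flips relax back to it. A sketch with hypothetical helper names and probe counts; unstable fixed points of the kind the paper studies would fail the second test.

```python
import numpy as np

def is_stable_fixed_point(J, s, rng, n_probes=20):
    """True if s is a fixed point of synchronous sign dynamics and
    single-spin-flip perturbations relax back to it."""
    if not np.array_equal(np.where(J @ s >= 0, 1, -1), s):
        return False
    N = len(s)
    for i in rng.choice(N, size=min(n_probes, N), replace=False):
        p = s.copy()
        p[i] *= -1
        for _ in range(50):                  # synchronous relaxation
            new = np.where(J @ p >= 0, 1, -1)
            if np.array_equal(new, p):
                break
            p = new
        if not np.array_equal(p, s):
            return False
    return True

rng = np.random.default_rng(8)
xi = rng.choice([-1, 1], size=(3, 80))
J = xi.T @ xi / 80
np.fill_diagonal(J, 0.0)
print(is_stable_fixed_point(J, xi[0], rng))  # low load: typically True
```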

Robust computation with rhythmic spike patterns

E. Paxon Frady, Friedrich T. Sommer
2019 Proceedings of the National Academy of Sciences of the United States of America  
Building on Hebbian neural associative memories, like Hopfield networks, we first propose threshold phasor associative memory (TPAM) networks.  ...  Second, we construct two spiking neural networks to approximate the complex algebraic computations in TPAM: a reductionist model with resonate-and-fire neurons and a biologically plausible network of integrate-and-fire neurons.  ...
doi:10.1073/pnas.1902653116 pmid:31431524 pmcid:PMC6731666 fatcat:urhs462ppzdvnfjijhn7cchdea

Index

1992 Neural Computation  
Seeing Beyond the Nyquist Limit (Letter) 4(5):682-690 Sajda, P. — See Finkel, L. H. Schillen, T. — See König, P. Schmidhuber, J.  ...  On the Information Storage Capacity of Local Learning Rules (Letter) 4(5):703-711 Peterson, C. — See Gislen, L. Quinn, R. D. — See Beer, R. D. Rapp, M., Yarom, Y., and Segev, I.  ...
doi:10.1162/neco.1992.4.6.961 fatcat:75erbfoc7ja7pnjst4hv4a5vlm

Dreaming neural networks: forgetting spurious memories and reinforcing pure ones [article]

Alberto Fachechi, Elena Agliari, Adriano Barra
2018 arXiv   pre-print
The standard Hopfield model for associative neural networks accounts for biological Hebbian learning and acts as the harmonic oscillator for pattern recognition; however, its maximal storage capacity is  ...  In particular, beyond obtaining a phase diagram for the neural dynamics, we focus on synaptic plasticity and give explicit prescriptions for the temporal evolution of the synaptic matrix.  ...
arXiv:1810.12217v1 fatcat:fsowutitrjhjlfijwwmftdqj3u
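One classical concrete form of the "dreaming" idea is Hopfield-Feinstein-Palmer unlearning: relax from a random state to an attractor (often spurious) and subtract a small Hebbian term for it. The sketch below shows that move with an assumed unlearning rate eps; the paper's prescription for the synaptic matrix is more elaborate than this.

```python
import numpy as np

def unlearning_step(J, eps=0.01, rng=None, steps=100):
    """One 'dream': relax from a random state to an attractor, then
    subtract a small Hebbian term for it, weakening spurious minima."""
    rng = rng or np.random.default_rng()
    N = J.shape[0]
    s = rng.choice([-1, 1], size=N)
    for _ in range(steps):
        new = np.where(J @ s >= 0, 1, -1)
        if np.array_equal(new, s):
            break
        s = new
    J = J - eps * np.outer(s, s) / N
    np.fill_diagonal(J, 0.0)
    return J

rng = np.random.default_rng(9)
xi = rng.choice([-1, 1], size=(12, 100))
J = xi.T @ xi / 100
np.fill_diagonal(J, 0.0)
for _ in range(50):   # too many dreams also erode pure memories
    J = unlearning_step(J, eps=0.01, rng=rng)
```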

Robust Exponential Memory in Hopfield Networks

Christopher J. Hillar, Ngoc M. Tran
2018 Journal of Mathematical Neuroscience  
The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent  ...  To our knowledge, this is the first rigorous demonstration of super-polynomial noise-tolerant storage in recurrent networks of simple linear threshold elements.  ...
doi:10.1186/s13408-017-0056-2 pmid:29340803 pmcid:PMC5770423 fatcat:nbbhmyyb5nh3rc4dxbzyal272q

Recurrent correlation associative memories

T.-D. Chiueh, R.M. Goodman
1991 IEEE Transactions on Neural Networks  
Furthermore, the asymptotic storage capacity of the ECAM with limited dynamic range in its exponentiation nodes is found to be proportional to that dynamic range.  ...  Since they are based on two-layer recurrent neural networks and their operations depend on the correlation measure, we call these associative memories recurrent correlation associative memories (RCAMs).  ...  Since the seminal work of Hopfield [1], [2], there has been much interest in building associative memories using neural network approaches.  ...
doi:10.1109/72.80338 pmid:18276381 fatcat:ufqdbk7qqfhaflktzwyzl2bi6m
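The ECAM update itself is compact: the next state is the sign of a pattern mixture weighted exponentially by correlation, so high-overlap memories dominate recall. A sketch; subtracting the maximum overlap before exponentiating is a numerical-stability trick of mine, rescaling all weights by the same positive factor and leaving the signs unchanged.

```python
import numpy as np

def ecam_update(patterns, state, beta=1.0):
    """One ECAM step: next state = sign(sum_mu a^<xi_mu, s> xi_mu),
    with a = e^beta and stable exponentiation."""
    overlaps = patterns @ state                       # <xi_mu, s>
    w = np.exp(beta * (overlaps - overlaps.max()))    # positive rescaling
    out = np.sign(w @ patterns)
    out[out == 0] = 1
    return out

rng = np.random.default_rng(10)
xi = rng.choice([-1, 1], size=(20, 64))
noisy = xi[0] * np.where(rng.random(64) < 0.2, -1, 1)  # 20% bit flips
print(np.array_equal(ecam_update(xi, noisy), xi[0]))   # typically True
```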

Content addressable memory without catastrophic forgetting by heteroassociation with a fixed scaffold [article]

Sugandha Sharma, Sarthak Chandra, Ila R. Fiete
2022 arXiv   pre-print
...  patterns below capacity and a 'memory cliff' beyond, such that inserting a single additional pattern results in catastrophic forgetting of all stored patterns.  ...  We show analytically and experimentally that MESH nearly saturates the total information bound (given by the number of synapses) for CAM networks, invariant to the number of stored patterns, outperforming  ...  In this network, storage of information-dense patterns up to a critical capacity results in complete recovery of all patterns, and storage of a larger number of patterns results in partial reconstruction  ...
arXiv:2202.00159v2 fatcat:yjteef22wrfuxi7s6mllfpl6hm

A New Design Method for the Complex-Valued Multistate Hopfield Associative Memory

M.K. Muezzinoglu, C. Guzelis, J.M. Zurada
2003 IEEE Transactions on Neural Networks  
The maximum number of integral vectors that can be embedded into the energy landscape of the network by this method is investigated by computer experiments.  ...  Based on the solution of this system, it gives a recurrent network of multistate neurons with complex and symmetric synaptic weights, which operates on the finite state space {1, 2, ..., K}^n to minimize this  ...  Such a network will be called Hermitian hereafter. Several design procedures that employ inequalities in the design of recurrent neural networks have been reported, e.g., [12]-[14].  ...
doi:10.1109/tnn.2003.813844 pmid:18238068 fatcat:mhhy23b5lfd6xpdfdrouctdaqy
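The multistate neuron underlying such networks quantizes the phase of its complex local field onto the K-th roots of unity (one common variant of the csign activation). A sketch with a Hermitian, Hebbian-style weight matrix of my own; the paper's design method solves an inequality system instead, so this only illustrates the state space and update.

```python
import numpy as np

def csign(z, K):
    """Quantize the phase of each complex entry to the nearest of the
    K-th roots of unity (a common multistate activation)."""
    phase = np.angle(z) % (2 * np.pi)
    k = np.floor(phase * K / (2 * np.pi) + 0.5) % K
    return np.exp(2j * np.pi * k / K)

def recall(W, state, K, steps=50):
    """Synchronous recall with a Hermitian weight matrix W."""
    for _ in range(steps):
        new = csign(W @ state, K)
        if np.allclose(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(6)
K, N = 4, 64
xi = np.exp(2j * np.pi * rng.integers(0, K, size=(3, N)) / K)  # 3 patterns
W = xi.T @ xi.conj() / N            # Hermitian, Hebbian-style storage
print(np.allclose(recall(W, xi[0], K), xi[0]))  # typically True at low load
```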

Nonequilibrium landscape theory of neural networks

H. Yan, L. Zhao, L. Hu, X. Wang, E. Wang, J. Wang
2013 Proceedings of the National Academy of Sciences of the United States of America  
The Hopfield model gives us a clear dynamical picture of how neural circuits implement their memory storage and retrieval functions.  ...  Hopfield quantified the learning and memory process of symmetrically connected neural networks globally through equilibrium energy.  ...  In other words, the Lyapunov function characterizes the global behavior of not only symmetric but also asymmetric neural networks.  ...
doi:10.1073/pnas.1310692110 pmid:24145451 pmcid:PMC3831465 fatcat:wjhzfo6fa5bn3byml2dlcwu33a
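The equilibrium half of this picture is easy to reproduce: for a symmetric coupling matrix with zero diagonal, the Hopfield energy E(s) = -½ sᵀJs never increases under asynchronous sign updates, which is what makes it a Lyapunov function. A minimal check; sizes and seed are arbitrary choices of mine.

```python
import numpy as np

def energy(J, s):
    """Hopfield energy E(s) = -1/2 s^T J s; a Lyapunov function of the
    asynchronous dynamics when J is symmetric with zero diagonal."""
    return -0.5 * s @ J @ s

rng = np.random.default_rng(3)
N = 50
xi = rng.choice([-1, 1], size=(5, N))
J = xi.T @ xi / N
np.fill_diagonal(J, 0.0)
s = rng.choice([-1, 1], size=N)

energies = []
for _ in range(5 * N):                  # asynchronous single-spin updates
    i = rng.integers(N)
    s[i] = 1 if J[i] @ s >= 0 else -1
    energies.append(energy(J, s))
# Energy is non-increasing along the trajectory: prints True.
print(all(e2 <= e1 + 1e-12 for e1, e2 in zip(energies, energies[1:])))
```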
Showing results 1 — 15 out of 383.