1,429 Hits in 3.3 sec

On the Maximum Storage Capacity of the Hopfield Model

Viola Folli, Marco Leonetti, Giancarlo Ruocco
2017 Frontiers in Computational Neuroscience  
In past years, several works have been devoted to determining the maximum storage capacity of RNNs, especially for the case of the Hopfield network, the most popular kind of RNN.  ...  In this paper, we study the storage performance of a generalized Hopfield model, where the diagonal elements of the connection matrix are allowed to be different from zero.  ... 
doi:10.3389/fncom.2016.00144 pmid:28119595 pmcid:PMC5222833 fatcat:iazouq4g3zbsvin7rdigdtkvdi

STORAGE CAPACITY OF EXTREMELY DILUTED HOPFIELD MODEL

BURCU AKCAN, YİĞİT GÜNDÜÇ
2004 International Journal of Modern Physics C  
It is also shown that the increase of the storage capacity of the diluted system depends on the storage capacity of the fully connected Hopfield model and the fraction of diluted synapses.  ...  The storage capacity of the extremely diluted Hopfield model is studied by using Monte Carlo techniques.  ... 
doi:10.1142/s0129183104005802 fatcat:upuieqo6drgkpp3dnv4mwojysu

PERFORMANCE EVALUATION OF HOPFIELD ASSOCIATIVE MEMORY FOR COMPRESSED IMAGES

Manu Pratap Singh, Dr. S. S Pandey, Vikas Pandey
2015 ELK Asia Pacific Journal of Computer Science and Information Systems  
Keywords: Hopfield Neural Networks, Associative memory, Compressed Images storage & recalling, pattern storage networks  ...  The maximum number of patterns successfully recalled by the above procedure is a pointer to the maximum storage capacity of the Hopfield Network, which is further discussed in the results.  ...  The storage capacity of a neural network refers to the maximum number of patterns that can be stored and successfully recalled for a given number of nodes, N.  ... 
doi:10.16962/eapjcsis/issn.2394-0441/20150930.v1i2.02 fatcat:rgg3elqjcbga5kde3mqgiie2si

Robustness and information capacity of learning rules for neural network models

Th. Schnelle, A. Engel
1991 Physics Letters A  
For models based on the Hebbian rule it is shown that the information capacity becomes maximal if the synapses are restricted to the values (0, ±1) only.  ...  Investigating different learning rules concerning their robustness against dilution and static noise, we find a clear complementary relationship between robustness and high storage capacity.  ...  For comparison, the storage capacity of the Hopfield model with cut small bounds (curve b) is included.  ... 
doi:10.1016/0375-9601(91)90128-u fatcat:fcuu2rhuyneeljt6yjpajivpn4

Implementation of Hopfield Neural Network for its Capacity with Finger Print Images

Ramesh Chandra, Somesh Kumar, Puneet Goswami
2016 International Journal of Computer Applications  
Performance is measured with respect to storage capacity and recall of distorted or noisy patterns. Here we test the accretive behavior of the Hopfield neural network.  ...  This paper analyzes the Hopfield neural network for storage and recall of fingerprint images.  ...  The maximum number of patterns successfully recalled by the above procedure is a pointer to the maximum storage capacity of the Hopfield network, which is further discussed in the results.  ... 
doi:10.5120/ijca2016909625 fatcat:bw52jgwjx5cfvpfwmajswsfsly

Information capacity of a network of spiking neurons [article]

S. Scarpetta, A. de Candia
2019 arXiv   pre-print
We calculate the information capacity of the network, defined as the maximum number of patterns that can be encoded in the network times the number of bits carried by each pattern, normalized by the number of synapses, and find that it can reach a value α_max ≃ 0.27, similar to that of sequence-processing neural networks, and almost double the capacity of the static Hopfield model.  ... 
arXiv:1906.05584v2 fatcat:rs2vfy3kmrdazbk77jarca5bdu

QR Denoising using a Hopfield Network

2020 International journal of recent technology and engineering  
One of the biggest drawbacks of a noisy QR code is its poor performance and low storage capacity. Using a Hopfield network we can easily denoise the QR code and thereby increase the storage capacity.  ...  We propose an algorithm for denoising QR codes using the concept of a parallel Hopfield neural network.  ...  Also, as stated by the Hebbian rule of learning, the maximum capacity of a Hopfield associative network is approximately 0.15n [13].  ... 
doi:10.35940/ijrte.e6062.018520 fatcat:cmlxzqc3snchvdscbmgbm2ulpq
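
The ~0.15n figure cited above is the classical Hebbian storage limit (more precisely, α_c ≈ 0.138 for random patterns). A minimal sketch of Hebbian storage and noisy recall; the network size, pattern count, and noise level here are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                                  # number of neurons
P = 5                                    # patterns stored; well below the ~0.15*N limit
patterns = rng.choice([-1, 1], size=(P, N))

# Hebbian learning: W = (1/N) * sum of outer products, with zero diagonal
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0)

def recall(state, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        new = np.sign(W @ state)
        new[new == 0] = 1                # break ties deterministically
        if np.array_equal(new, state):
            break
        state = new
    return state

# Flip 10% of the bits of a stored pattern, then let the network clean it up
noisy = patterns[0].copy()
noisy[rng.choice(N, size=10, replace=False)] *= -1
restored = recall(noisy)
print(int((restored == patterns[0]).sum()), "of", N, "bits correct")
```

At this load (P/N = 0.05) the stored pattern is a deep attractor, so recall from 10% noise is typically near-perfect; pushing P past ≈ 0.14N makes retrieval break down abruptly.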

Study on Order Batching Model Design Based on Hopfield Neural Network

Hong Zhang
2015 Science Journal of Business and Management  
In the paper, a Hopfield Neural Network algorithm for sorting equipment was chosen to establish a capacity-constrained order batching model, which takes the shortest path of all orders as the objective function, and a maximum-equipment-utilization order batching model.  ... 
doi:10.11648/j.sjbm.20150302.12 fatcat:ufrlj7b6cfautijhd3ryqylmha

Dynamics and robustness of familiarity memory [article]

J.M. Cortes, A. Greve, A.B. Barrett, M.C.W. van Rossum
2007 arXiv   pre-print
Following previous computational models of familiarity memory, we investigate the dynamical properties of familiarity discrimination, and contrast two different familiarity discriminators: one based on  ...  For both discriminators we establish, via a combined method of signal-to-noise ratio and mean-field analysis, how the maximum number of successfully discriminated stimuli depends on the noise level.  ... 
arXiv:0710.1333v1 fatcat:bndgcimqzngdddycofmrhml7me

Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study

Do-Hyun Kim, Jinha Park, Byungnam Kahng, Constantine Dovrolis
2017 PLoS ONE  
The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size.  ...  Motivated by this observation, we consider the Hopfield model on scale-free networks and obtain a different pattern of associative memory retrieval from that obtained on the fully connected network: the  ...  Therefore, the storage capacity α_c is the maximum value of the storage rate α. Detailed calculations for those parameters are presented in Section 4 of the S1 File.  ... 
doi:10.1371/journal.pone.0184683 pmid:29077721 pmcid:PMC5659639 fatcat:euebqosjybf2xgschqcu3pg5au

"Unlearning" increases the storage capacity of content addressable memories

D. Kleinfeld, D.B. Pendergraft
1987 Biophysical Journal  
The storage and retrieval of information in networks of biological neurons can be modeled by certain types of content addressable memories (CAMs).  ...  Mechanisms for the increase in capacity are identified and illustrated in terms of an energy function that describes the convergence properties of the network.  ...  The dependence of the storage capacity on the number of unlearning trials, m, was studied for fixed values of e.  ... 
doi:10.1016/s0006-3495(87)83310-6 pmid:3801583 pmcid:PMC1329862 fatcat:genlp7m6mnf4ncqagqd7d6z65u

Beyond the Maximum Storage Capacity Limit in Hopfield Recurrent Neural Networks

Giorgio Gosti, Viola Folli, Marco Leonetti, Giancarlo Ruocco
2019 Entropy  
This paper shows how autapses together with stable-state redundancy can improve the storage capacity of a recurrent neural network.  ...  This thus limits the potential use of this kind of Hopfield network as an associative memory.  ... 
doi:10.3390/e21080726 pmid:33267440 fatcat:biectlv2ffec5mzkwklejc3g6y

QR code denoising using parallel Hopfield networks [article]

Ishan Bhatnagar, Shubhang Bhatnagar
2018 arXiv   pre-print
One of the major drawbacks in their use as noise-tolerant associative memory is their low storage capacity, scaling only linearly with the number of nodes in the network.  ...  Our paper proposes a new algorithm to allow the use of several Hopfield networks in parallel, thereby increasing the cumulative storage capacity of the system many times compared to a single Hopfield network.  ...  A single network with the same storage capacity would require a storage of the order of (3249 × n)² terms.  ... 
arXiv:1812.01065v2 fatcat:u3ir3ukdpnat5f355qglw5dhye
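
The parallel idea in this abstract can be illustrated with a toy version: split the pattern set across several small Hebbian networks so each stays below its individual capacity, then pick the sub-network whose relaxed state stays closest to the probe. The round-robin assignment and overlap-based selection below are illustrative guesses, not the authors' actual algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 64, 4                                # neurons per network, number of sub-networks
pats = rng.choice([-1, 1], size=(16, N))    # 16 patterns, 4 per sub-network

# Round-robin the patterns over K independent Hebbian weight matrices
Ws = []
for k in range(K):
    chunk = pats[k::K]
    W = chunk.T @ chunk / N
    np.fill_diagonal(W, 0)
    Ws.append(W)

def settle(W, s, steps=20):
    """Synchronous sign updates until a fixed point (or step limit)."""
    for _ in range(steps):
        nxt = np.sign(W @ s)
        nxt[nxt == 0] = 1
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

def recall(probe):
    # Run every sub-network; the one that actually stores the pattern barely
    # moves the probe, while the others drag it to an unrelated attractor,
    # so keep the candidate with the largest overlap with the probe.
    candidates = [settle(W, probe.copy()) for W in Ws]
    return max(candidates, key=lambda s: float(probe @ s))

noisy = pats[3].copy()
noisy[rng.choice(N, 6, replace=False)] *= -1
out = recall(noisy)
print(int((out == pats[3]).sum()), "of", N, "bits recovered")
```

Each sub-network holds 4 patterns out of 64 neurons (load ≈ 0.06), so its own recall is reliable, and the overlap test routes the probe to the right one; total storage grows roughly linearly with the number of sub-networks.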

An Introduction to Quaternion-Valued Recurrent Projection Neural Networks [article]

Marcos Eduardo Valle, Rodolfo Anibal Lobo
2019 arXiv   pre-print
Furthermore, computational experiments reveal that QRPNNs exhibit greater storage capacity and noise tolerance than their corresponding QRCNNs.  ...  We show that QRPNNs overcome the cross-talk problem of QRCNNs. Thus, they are appropriate to implement associative memories.  ...  The projection rule increases the storage capacity of the Hopfield network to n − 1 items.  ... 
arXiv:1909.09227v1 fatcat:zoegdaub2fftxe7sm2wotgpace
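
The projection (pseudo-inverse) rule cited in this abstract can be sketched directly: W = X⁺X is the orthogonal projector onto the span of the stored patterns, so every stored pattern is an exact fixed point, even for correlated patterns, for up to n − 1 linearly independent items. Sizes below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 32, 20                    # 20 patterns: well beyond the ~0.14*N Hebbian limit
X = rng.choice([-1, 1], size=(P, N)).astype(float)

# Projection rule: W projects states onto the row space of X, so W @ x = x
# exactly for every stored pattern x (patterns need only be linearly independent)
W = np.linalg.pinv(X) @ X

for p in X:
    assert np.array_equal(np.sign(W @ p), p)   # exact recall of all P patterns
print("all", P, "patterns are fixed points")
```

The same construction with a Hebbian matrix would fail badly at this load; the trade-off is that building W requires a pseudo-inverse (non-local learning) rather than a simple outer-product sum.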

Neural Network Capacity for Multilevel Inputs [article]

Matt Stowe, Subhash Kak
2013 arXiv   pre-print
New learning strategies are proposed to increase Hopfield network capacity, and the scalability of these methods is also examined with respect to the size of the network.  ...  This paper examines the memory capacity of generalized neural networks.  ... 
arXiv:1307.8104v1 fatcat:merr2oe52nhobdmijn7tm2pvoq
Showing results 1 — 15 out of 1,429 results