Robust Exponential Memory in Hopfield Networks

Christopher J. Hillar, Ngoc M. Tran
2018, Journal of Mathematical Neuroscience
Abstract

The Hopfield recurrent neural network is a classical auto-associative model of memory, in which collections of symmetrically coupled McCulloch-Pitts binary neurons interact to perform emergent computation. Although previous researchers have explored the potential of this network to solve combinatorial optimization problems or store recurring activity patterns as attractors of its deterministic dynamics, a basic open problem is to design a family of Hopfield networks with a number of noise-tolerant memories that grows exponentially with neural population size. Here, we discover such networks by minimizing probability flow, a recently proposed objective for estimating parameters in discrete maximum entropy models. By descending the gradient of the convex probability flow, our networks adapt synaptic weights to achieve robust exponential storage, even when presented with vanishingly small numbers of training patterns. In addition to providing a new set of low-density error-correcting codes that achieve Shannon's noisy channel bound, these networks also efficiently solve a variant of the hidden clique problem in computer science, opening new avenues for real-world applications of computational models originating from biology.

Abbreviations: OPR, outer-product rule; MPF, minimum probability flow.

1 Introduction

Discovered first by Pastur and Figotin [1] as a simplified spin glass [2] in statistical physics, the Hopfield model [3] is a recurrent network of n linear threshold McCulloch-Pitts neurons that can store ~n/(4 ln n) binary patterns [5] as distributed "memories" in the form of auto-associative fixed-point attractors. While several aspects of these networks appeared earlier (see, e.g., [6] for dynamics and learning), the approach nonetheless introduced ideas from physics into the theoretical study of neural computation.
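As a concrete illustration, the fixed-point dynamics described above can be sketched in a few lines. The {0,1} state coding, the explicit threshold vector, and the sequential update schedule here are illustrative conventions, not necessarily the exact ones used in the paper.

```python
import numpy as np

def hopfield_update(J, theta, x, max_sweeps=100):
    """Asynchronous dynamics of a binary Hopfield network.

    Each neuron i is set to 1 if its weighted input J[i] @ x exceeds
    its threshold theta[i], else 0.  For symmetric J with zero diagonal,
    the energy -x.J.x/2 + theta.x never increases under these updates,
    so the dynamics converge to a fixed-point attractor (a "memory").
    """
    x = np.array(x)
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):
            new_state = 1 if J[i] @ x > theta[i] else 0
            if new_state != x[i]:
                x[i] = new_state
                changed = True
        if not changed:  # fixed point reached
            break
    return x
```

For example, a fully connected 3-neuron network with unit weights and thresholds of 0.5 recovers the all-ones pattern from a one-bit corruption while leaving the all-zeros pattern fixed.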
The Hopfield model and its variants have been studied intensely in theoretical neuroscience and statistical physics [7], but investigations into its utility for memory and coding have mainly focused on storing collections of patterns X using a "one-shot" outer-product rule (OPR) for learning, which essentially assigns abstract synaptic weights between neurons to be their correlation, an early idea in neuroscience [8, 9]. Independent of learning, at most 2n randomly generated dense patterns can be simultaneously stored in networks with n neurons [10]. Despite this restriction, super-linear capacity in Hopfield networks is possible for special pattern classes and connectivity structures. For instance, if the patterns to memorize contain many zeros, it is possible to store nearly a quadratic number [11]. Other examples are random networks, which have ≈1.22^n attractors asymptotically [12], and networks storing all permutations [13]. In both examples of exponential storage, however, memories have vanishingly small basins of attraction, making them ill-suited for noise-tolerant pattern storage. Interestingly, the situation is even worse for networks storing permutations: any Hopfield network storing permutations will fail to recover the derangements (more than a third of all permutations) from asymptotically vanishing noise (see Theorem 4, proved in Sect. 5).

In this note, we design a family of sparsely connected n-node Hopfield networks with (asymptotically, as n → ∞) robustly stored fixed-point attractors by minimizing "probability flow" [14, 15]. To our knowledge, this is the first rigorous demonstration of super-polynomial noise-tolerant storage in recurrent networks of simple linear threshold elements.
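For contrast with the probability-flow approach, the classical one-shot OPR mentioned above can be sketched as follows. The ±1 recoding and the normalization by the number of patterns are common textbook conventions, assumed here rather than taken from the paper.

```python
import numpy as np

def outer_product_rule(X):
    """One-shot OPR learning: synaptic weight J[i, j] is the empirical
    correlation of neurons i and j over the +/-1-coded training patterns.

    X : (num_patterns, n) array with entries in {0, 1}.
    Returns a symmetric weight matrix J with zero diagonal.
    """
    S = 2.0 * np.asarray(X) - 1.0    # recode {0,1} -> {-1,+1}
    J = S.T @ S / S.shape[0]         # average of pattern outer products
    np.fill_diagonal(J, 0.0)         # no self-coupling
    return J
```

Patterns that are mutually orthogonal in the ±1 coding are exact fixed points of the resulting sign dynamics; correlated patterns interfere, which is one way to see why OPR capacity is only linear in n.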
The approach also provides a normative, convex, biologically plausible learning mechanism for discovering these networks from small amounts of data and reveals new connections between binary McCulloch-Pitts neural networks, efficient error-correcting codes, and computational graph theory.
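To make the learning principle concrete, here is a minimal sketch of the probability-flow objective for an Ising-type Hopfield energy, assuming single-bit-flip connectivity between binary states; the exact parameterization and connectivity used in the paper may differ.

```python
import numpy as np

def energy(J, theta, x):
    """Hopfield/Ising energy E(x) = -x.J.x / 2 + theta.x for binary x."""
    return -0.5 * x @ J @ x + theta @ x

def mpf_objective(J, theta, X):
    """Minimum probability flow objective for binary data X:

        K(J, theta) = sum_{x in data} sum_{x' one bit flip from x}
                          exp((E(x) - E(x')) / 2)

    K is a sum of exponentials of functions linear in (J, theta), hence
    convex; driving it down pushes data states below their non-data
    neighbours in energy, carving out fixed-point attractors.
    """
    K = 0.0
    for x in np.asarray(X, dtype=float):
        Ex = energy(J, theta, x)
        for i in range(len(x)):
            x_flip = x.copy()
            x_flip[i] = 1.0 - x_flip[i]   # single-bit-flip neighbour
            K += np.exp((Ex - energy(J, theta, x_flip)) / 2.0)
    return K
```

Descending the gradient of K in (J, theta), with any off-the-shelf convex optimizer, is how the weights of the networks in this paper are adapted from training patterns.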
doi:10.1186/s13408-017-0056-2 | pmid:29340803 | pmcid:PMC5770423