Training deep neural networks for binary communication with the Whetstone method

William Severa, Craig M. Vineyard, Ryan Dellana, Stephen J. Verzi, James B. Aimone
2019 · Nature Machine Intelligence (Springer Nature)
This paper presents a new technique for training networks for low-precision communication. Targeting minimal communication between nodes not only enables the use of emerging spiking neuromorphic platforms, but may additionally streamline processing conventionally. Low-power and embedded neuromorphic processors potentially offer dramatic performance-per-Watt improvements over traditional von Neumann processors; however, programming these brain-inspired platforms generally requires platform-specific expertise, which limits their applicability. To date, the majority of artificial neural networks have not operated using discrete spike-like communication. We present a method for training deep spiking neural networks using an iterative modification of the backpropagation optimization algorithm. This method, which we call Whetstone, effectively and reliably configures a network for a spiking hardware target with little, if any, loss in performance. Whetstone networks use single-time-step binary communication and do not require a rate code or other spike-based coding scheme, thus producing networks comparable in timing and size to conventional ANNs, albeit with binarized communication. We demonstrate Whetstone on a number of image classification networks, describing how the sharpening process interacts with different training optimizers and changes the distribution of activity within the network. We further note that Whetstone is compatible with several non-classification neural network applications, such as autoencoders and semantic segmentation. Whetstone is widely extendable and is currently implemented using custom activation functions within the Keras wrapper to the popular TensorFlow machine learning framework.
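The abstract describes gradually "sharpening" a continuous activation into a binary step during training, implemented as custom Keras activations. The sketch below is a minimal illustration of that general idea, not the authors' released Whetstone code: the `SharpenedActivation` layer, its bounded-ReLU parameterization, and the `SharpeningSchedule` callback are hypothetical names and schedules chosen only to show how such an approach can be expressed in Keras/TensorFlow.

```python
# Illustrative sketch (assumed implementation, not the published Whetstone code):
# a bounded-ReLU activation whose linear region shrinks as a "sharpness" value
# rises, so that it approaches a 0/1 step function by the end of training.

import tensorflow as tf
from tensorflow import keras


class SharpenedActivation(keras.layers.Layer):
    """Piecewise-linear activation that interpolates toward a binary step.

    With sharpness s in [0, 1), the activation is a bounded ReLU of width
    (1 - s) centred at 0.5; as s approaches 1 it approaches a hard threshold,
    so the layer emits (near-)binary values suitable for spiking targets.
    """

    def __init__(self, **kwargs):
        super().__init__(**kwargs)
        self.sharpness = tf.Variable(0.0, trainable=False)

    def call(self, x):
        width = 1.0 - self.sharpness          # remaining linear region
        lo = 0.5 - width / 2.0                # inputs below lo map to 0
        return tf.clip_by_value((x - lo) / tf.maximum(width, 1e-6), 0.0, 1.0)


class SharpeningSchedule(keras.callbacks.Callback):
    """Increase sharpness of every SharpenedActivation a little each epoch."""

    def __init__(self, start_epoch=5, increment=0.1):
        super().__init__()
        self.start_epoch = start_epoch
        self.increment = increment

    def on_epoch_end(self, epoch, logs=None):
        if epoch < self.start_epoch:
            return
        for layer in self.model.layers:
            if isinstance(layer, SharpenedActivation):
                new_value = min(layer.sharpness.numpy() + self.increment, 0.99)
                layer.sharpness.assign(new_value)


# Usage: place the activation after each Dense/Conv layer and train as usual;
# once sharpness is near 1 the hidden layers communicate with binary values.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(28, 28)),
    keras.layers.Dense(256), SharpenedActivation(),
    keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
# model.fit(x_train, y_train, epochs=20, callbacks=[SharpeningSchedule()])
```

In the paper the sharpening is applied iteratively during optimization (for example, progressing through the network rather than uniformly), so the simple per-epoch schedule above should be read only as a simplification of that idea.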
doi:10.1038/s42256-018-0015-y
Full-text PDF (Web Archive, preprint version): https://web.archive.org/web/20200913104752/https://arxiv.org/pdf/1810.11521v1.pdf