A copy of this work was available on the public web and has been preserved in the Wayback Machine. The capture dates from 2020; you can also visit the original URL.
The file type is application/pdf.
Spatially Arranged Sparse Recurrent Neural Networks for Energy Efficient Associative Memory
2019
IEEE Transactions on Neural Networks and Learning Systems
The development of hardware neural networks, including neuromorphic hardware, has accelerated over the past few years. However, it is challenging to operate very large-scale neural networks on low-power hardware devices, partly because of the cost of signal transmission across a massive number of interconnections. Our aim is to address this communication cost from an algorithmic viewpoint and to study learning algorithms for energy-efficient information processing. Here, we consider two
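The idea of spatially arranged sparse connectivity can be illustrated with a minimal sketch. The following is not the paper's algorithm; it is a generic Hopfield-style associative memory (Hebbian storage, sign-threshold recall) in which each neuron connects only to spatially nearby neurons on a ring, so long-range interconnections, and hence communication cost, are pruned. All sizes (`N`, `radius`, number of patterns) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 100                      # number of neurons, placed on a 1-D ring (assumed size)
positions = np.arange(N)

# Distance-dependent sparse connectivity: each neuron only connects
# to neighbors within a local radius, eliminating long-range wiring.
radius = 10
dist = np.abs(positions[:, None] - positions[None, :])
dist = np.minimum(dist, N - dist)          # wrap-around (ring) distance
mask = (dist > 0) & (dist <= radius)       # no self-connections

# Store a few random +/-1 patterns with the Hebbian outer-product rule,
# then zero out every weight outside the local mask.
patterns = rng.choice([-1.0, 1.0], size=(3, N))
W = (patterns.T @ patterns) / N
W *= mask

def recall(state, steps=20):
    """Synchronous recall dynamics: s <- sign(W s)."""
    for _ in range(steps):
        state = np.where(W @ state >= 0, 1.0, -1.0)
    return state

# Recall from a noisy cue: flip a few bits of the first stored pattern.
cue = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
cue[flip] *= -1
out = recall(cue)
overlap = float(out @ patterns[0]) / N     # 1.0 means perfect recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

With local-only weights, each recall update touches at most `2 * radius` incoming connections per neuron instead of `N - 1`, which is the kind of wiring reduction that motivates studying sparsity for low-power hardware.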
doi:10.1109/tnnls.2019.2899344
pmid:30892239
fatcat:idjzdwe665aodl7wdpz5jqftx4