Balancing the learning ability and memory demand of a perceptron-based dynamically trainable neural network
2018
Journal of Supercomputing
Artificial neural networks (ANNs) have become a popular means of solving complex problems in prediction-based applications such as image and natural language processing. Two challenges prominent in the neural network domain are the practicality of hardware implementation and dynamically training the network. In this study, we address these challenges with a development methodology that balances the hardware footprint and the quality of the ANN. We use the well-known perceptron-based branch …
doi:10.1007/s11227-018-2374-x
fatcat:nuirrf47erc4lamcbiby7ul75m