Balancing the learning ability and memory demand of a perceptron-based dynamically trainable neural network

Edward Richter, Spencer Valancius, Josiah McClanahan, John Mixter, Ali Akoglu
The Journal of Supercomputing, 2018
Artificial neural networks (ANNs) have become a popular means of solving complex problems in prediction-based applications such as image and natural language processing. Two challenges prominent in the neural network domain are the practicality of hardware implementation and dynamically training the network. In this study we address these challenges with a development methodology that balances the hardware footprint and the quality of the ANN. We use the well-known perceptron-based branch prediction problem as a case study for demonstrating this methodology. This problem is well suited for analyzing dynamic hardware implementations of ANNs because the predictor both resides in hardware and trains dynamically. Using our hierarchical configuration search-space exploration, we show that we can decrease the memory footprint of a standard perceptron-based branch predictor by 2.3x with only a 0.6% decrease in prediction accuracy.
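
For readers unfamiliar with the case study, below is a minimal sketch of a perceptron-based branch predictor in the style of Jimenez and Lin. The table size, history length, weight width, and training threshold here are illustrative assumptions, not the configurations explored in the paper.

/* A minimal sketch of a perceptron branch predictor in the style of
 * Jimenez and Lin. TABLE_SIZE, HISTORY, and the 8-bit weight width are
 * illustrative placeholders, not the paper's explored configurations. */
#include <stdint.h>
#include <stdlib.h>

#define TABLE_SIZE 256  /* number of perceptrons (assumed) */
#define HISTORY    12   /* global history length (assumed) */
/* Training threshold; 1.93*h + 14 is the heuristic from the original
 * perceptron predictor literature. */
#define THRESHOLD  ((int)(1.93 * HISTORY + 14))

static int8_t weights[TABLE_SIZE][HISTORY + 1]; /* w[0] is the bias weight */
static int    history[HISTORY];                 /* +1 = taken, -1 = not taken */

/* Saturating add keeps each weight within its fixed bit width,
 * which is part of the storage cost a hardware design must budget. */
static int8_t sat_add(int8_t w, int d)
{
    int v = w + d;
    return (int8_t)(v > 127 ? 127 : (v < -128 ? -128 : v));
}

/* Dot product of the selected perceptron's weights with the history;
 * predict taken when the output y is non-negative. */
static int predict(uint32_t pc, int *y_out)
{
    const int8_t *w = weights[pc % TABLE_SIZE];
    int y = w[0];
    for (int i = 0; i < HISTORY; i++)
        y += w[i + 1] * history[i];
    *y_out = y;
    return y >= 0;
}

/* Train after the branch resolves (taken is +1 or -1): update weights only
 * on a misprediction or when |y| is below the threshold, then shift the
 * outcome into the global history register. */
static void train(uint32_t pc, int y, int taken)
{
    int8_t *w = weights[pc % TABLE_SIZE];
    int predicted = (y >= 0) ? 1 : -1;
    if (predicted != taken || abs(y) <= THRESHOLD) {
        w[0] = sat_add(w[0], taken);
        for (int i = 0; i < HISTORY; i++)
            w[i + 1] = sat_add(w[i + 1], taken * history[i]);
    }
    for (int i = HISTORY - 1; i > 0; i--)
        history[i] = history[i - 1];
    history[0] = taken;
}

The per-weight bit width, history length, and table size together determine the predictor's memory footprint, which is the quantity the paper's configuration search-space exploration trades off against prediction accuracy.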
doi:10.1007/s11227-018-2374-x