This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeting energy-constrained wireless applications. Fixed-point binary numbers are used in most application-specific integrated circuit (ASIC) designs, and the ReLU activation function is used in most ANNs. It is well known that, owing to the computation-intensive tasks involved, the computational burden of ANNs is very heavy.

doi:10.1049/cmu2.12129
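To make the combination concrete, below is a minimal sketch of a fixed-point neuron with ReLU activation, not the paper's design. The Q8.8 format, the function name fixed_point_relu_neuron, and the saturation behavior are all assumptions for illustration; the key point it demonstrates is that ReLU is nearly free in fixed-point hardware, since the accumulator's sign bit alone decides whether the output is clamped to zero.

```c
#include <stdint.h>
#include <stddef.h>

/* Assumed Q8.8 fixed-point format: 16-bit values, 8 fractional bits. */
#define FRAC_BITS 8

/* One neuron: fixed-point dot product in a wide accumulator, then ReLU.
 * The bias is assumed to be given in the product scale (Q16.16). */
int16_t fixed_point_relu_neuron(const int16_t *x, const int16_t *w,
                                size_t n, int32_t bias)
{
    int32_t acc = bias;                /* wide accumulator avoids overflow */
    for (size_t i = 0; i < n; i++)
        acc += (int32_t)x[i] * w[i];   /* Q8.8 * Q8.8 -> Q16.16 product */

    acc >>= FRAC_BITS;                 /* requantize back to Q8.8 */

    if (acc < 0)                       /* ReLU: max(0, acc); just the sign bit */
        return 0;

    /* Saturate to the int16_t range instead of wrapping around. */
    return acc > INT16_MAX ? INT16_MAX : (int16_t)acc;
}
```

In an ASIC realization the same structure maps to a multiply-accumulate datapath whose ReLU stage reduces to gating the output on the accumulator's sign, which is one reason fixed-point ReLU networks suit energy-constrained designs.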