Low‐complexity neuron for fixed‐point artificial neural networks with ReLU activation function in energy‐constrained wireless applications

Wen‐Long Chin, Qinyu Zhang, Tao Jiang
2021 IET Communications  
This work introduces an efficient neuron design for fixed-point artificial neural networks (ANNs) with the rectified linear unit (ReLU) activation function, targeted at energy-constrained wireless applications. Fixed-point binary numbers are used in most application-specific integrated circuit designs, and ReLU is the most common activation function in ANNs. It is well known that, owing to the computation-intensive tasks involved, the computational burden of ANNs is extremely heavy.
Consequently, many practitioners and researchers are exploring ways to reduce the implementation complexity of ANNs, particularly for battery-powered wireless applications. To this end, a low-complexity neuron is proposed that predicts the sign bit of the input to the non-linear activation function, ReLU, by exploiting the saturation characteristic of the activation function. According to simulation results on random data, the computation overhead of a neuron using the proposed technique is reduced by 29.6% compared with a conventional neuron using a word length of 8 bits, without noticeably increasing the prediction error. A comparison of the proposed algorithm against the popular 16-bit fixed-point format of the convolutional network AlexNet indicates a computation saving of 48.58%. IET Commun. 2021;15:917-923.
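The core idea above, predicting the sign of the pre-activation so that ReLU's clipping to zero lets the rest of the multiply-accumulate work be skipped, can be illustrated with a sketch. The code below is an assumed illustration, not the authors' algorithm: it accumulates a fixed-point dot product bit-plane by bit-plane from the most significant bit, bounds the maximum contribution the unprocessed bit planes can still add, and exits early once the sign of the final sum is guaranteed negative. The function name and the bit-serial scheduling are illustrative choices.

```python
import numpy as np

def relu_bitserial(weights, x, in_bits=8):
    """Illustrative sketch: bit-serial MAC over unsigned fixed-point
    inputs, processed MSB-first. Once the running partial sum plus the
    largest value the remaining bit planes could contribute is still
    negative, the ReLU output must be 0, so the remaining planes are
    skipped. Returns (relu_output, bit_planes_processed)."""
    weights = np.asarray(weights, dtype=np.int64)
    x = np.asarray(x, dtype=np.int64)          # unsigned fixed-point inputs
    pos_w = int(weights[weights > 0].sum())    # bound on positive weight mass
    partial = 0
    planes_done = 0
    for b in range(in_bits - 1, -1, -1):       # MSB -> LSB
        bits = (x >> b) & 1                    # current bit plane of inputs
        partial += int(bits @ weights) << b    # accumulate this plane
        planes_done += 1
        # Unprocessed planes can add at most (2^b - 1) * pos_w.
        if partial + ((1 << b) - 1) * pos_w < 0:
            return 0, planes_done              # sign decided: ReLU clips to 0
    return max(partial, 0), planes_done
```

For example, with `weights = [-4, 1]` and `x = [255, 3]`, the exact pre-activation is -1017; the sketch detects the negative sign after the first bit plane and skips the remaining seven, which is the kind of saving the abstract quantifies.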
doi:10.1049/cmu2.12129