LADNet: an ultra-lightweight and efficient Dilated Residual Network with Light-Attention Module

Junyan Yang, Jie Jiang, Yujie Fang, Jiahao Sun
2021 IEEE Access  
Image classification is an important branch of computer vision. Most mainstream CNNs are large and consume excessive computing resources, so their cost-performance ratio in image classification is unsatisfactory. This work therefore proposes a hybrid spatial and channel attention module (the Light-Attention module), an ultra-lightweight but efficient attention module. Given an intermediate feature map, the Light-Attention module first automatically derives the most important channel attention maps using global stochastic pooling and a multilayer perceptron (MLP). The residual structure of the Light-Attention module then repeatedly reintroduces global information for secondary screening. For better performance, our Max module and Mean module extract key spatial features on top of the preceding operations. In extracting the key attention map, LADNet uses the most economical operations and saves a large number of network parameters. The Light-Attention module can be seamlessly integrated into any network, and its parameters and required computing resources are both very small. We verified the module's ability to extract key image information through ablation experiments on the CIFAR-10, CIFAR-100, and ImageNet datasets: the image classification accuracy reaches 98.7% on CIFAR-10, 96.5% on CIFAR-100, and 83.9% on ImageNet. With the Light-Attention module, LADNet reduces the parameters of the convolutional neural network to 71% of the original. Compared with other CNNs used for image classification, the Light-Attention module uses the fewest parameters while achieving state-of-the-art results. The experiments show that combining a DRN (Dilated Residual Network) with the Light-Attention module achieves superior performance. INDEX TERMS Light-Attention module, global stochastic pooling, DRN (Dilated Residual Network).
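The abstract describes channel attention built from global stochastic pooling followed by an MLP, with Max and Mean modules producing spatial maps afterward. The following is a minimal NumPy sketch of those mechanics, not the authors' implementation: the weight shapes, the reduction ratio `r`, and the residual placement are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_pool(fmap):
    """Global stochastic pooling: per channel, sample one activation with
    probability proportional to its magnitude (assumes post-ReLU inputs)."""
    C = fmap.shape[0]
    flat = np.maximum(fmap.reshape(C, -1), 0)
    pooled = np.empty(C)
    for c in range(C):
        s = flat[c].sum()
        pooled[c] = 0.0 if s == 0 else rng.choice(flat[c], p=flat[c] / s)
    return pooled

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def light_attention(fmap, W1, W2):
    """Sketch of the channel branch: stochastic-pooled descriptor -> 2-layer
    MLP -> sigmoid gate, with a residual connection (placement assumed)."""
    d = stochastic_pool(fmap)           # (C,) channel descriptor
    h = np.maximum(W1 @ d, 0)           # hidden layer with reduction, ReLU
    w = sigmoid(W2 @ h)                 # per-channel weights in (0, 1)
    return fmap * w[:, None, None] + fmap  # reweighted channels + residual

def spatial_maps(fmap):
    """Max and Mean modules: collapse channels into two H x W spatial maps."""
    return fmap.max(axis=0), fmap.mean(axis=0)

# Toy usage on an 8-channel 4x4 feature map, reduction ratio r = 2.
C, H, W, r = 8, 4, 4, 2
x = np.abs(rng.normal(size=(C, H, W)))
W1 = rng.normal(scale=0.1, size=(C // r, C))
W2 = rng.normal(scale=0.1, size=(C, C // r))
y = light_attention(x, W1, W2)
m_max, m_mean = spatial_maps(y)
```

The MLP bottleneck (`C -> C/r -> C`) is what keeps the parameter count small; how the two spatial maps are fused back into the feature map is left out here, since the abstract does not specify it.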
doi:10.1109/access.2021.3065338