ThreshNet: An Efficient DenseNet Using Threshold Mechanism to Reduce Connections

Bibliographic Details
Title: ThreshNet: An Efficient DenseNet Using Threshold Mechanism to Reduce Connections
Authors: Ju, Rui-Yang, Lin, Ting-Yu, Jian, Jia-Hao, Chiang, Jen-Shiun, Yang, Wei-Bin
Publication Year: 2022
Collection: Computer Science
Subject Terms: Computer Science - Computer Vision and Pattern Recognition
Description: With the continuous development of neural networks for computer vision tasks, more and more network architectures have achieved outstanding success. As one of the most advanced neural network architectures, DenseNet shortcuts all feature maps to address the model depth problem. Although this architecture achieves excellent accuracy with few parameters, it requires excessive inference time. To address this problem, HarDNet reduces the connections between feature maps, making the remaining connections resemble harmonic waves. However, this compression method may reduce model accuracy while increasing the number of parameters and the model size. Although the architecture may reduce memory access time, its overall performance can still be improved. We therefore propose a new network architecture, ThreshNet, which uses a threshold mechanism to further optimize the connection method. Different numbers of connections are discarded for different convolution layers to accelerate network inference. The proposed network has been evaluated on image classification with the CIFAR-10 and SVHN datasets on NVIDIA RTX 3050 and Raspberry Pi 4 platforms. The experimental results show that, compared with HarDNet68, GhostNet, MobileNetV2, ShuffleNet, and EfficientNet, the inference time of the proposed ThreshNet79 is 5%, 9%, 10%, 18%, and 20% faster, respectively. The number of parameters of ThreshNet95 is 55% less than that of HarDNet85. These model compression and acceleration methods speed up inference, enabling network models to operate on mobile devices.
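Note: The description above states that ThreshNet keeps DenseNet-style dense connections for early layers and discards connections for layers beyond a threshold. The following is a minimal PyTorch-style sketch of that general idea, assuming a hypothetical "keep only the last k feature maps after the threshold" rule; the class name, threshold value, and pruning rule are illustrative assumptions and do not reproduce the paper's exact connection scheme.

# Minimal sketch of a threshold-pruned dense block (illustrative, not the paper's exact architecture).
import torch
import torch.nn as nn

class ThresholdDenseBlock(nn.Module):
    def __init__(self, in_channels, growth_rate, num_layers, threshold, keep_last):
        super().__init__()
        self.threshold = threshold    # layer index after which connections are pruned (assumed rule)
        self.keep_last = keep_last    # number of most recent feature maps kept past the threshold
        self.layers = nn.ModuleList()
        channels = [in_channels]
        for i in range(num_layers):
            if i < threshold:
                in_ch = sum(channels)               # dense: concatenate all previous feature maps
            else:
                in_ch = sum(channels[-keep_last:])  # thresholded: keep only the last k feature maps
            self.layers.append(nn.Sequential(
                nn.Conv2d(in_ch, growth_rate, kernel_size=3, padding=1, bias=False),
                nn.BatchNorm2d(growth_rate),
                nn.ReLU(inplace=True),
            ))
            channels.append(growth_rate)

    def forward(self, x):
        features = [x]
        for i, layer in enumerate(self.layers):
            if i < self.threshold:
                inp = torch.cat(features, dim=1)
            else:
                inp = torch.cat(features[-self.keep_last:], dim=1)
            features.append(layer(inp))
        return torch.cat(features, dim=1)

# Example usage on a CIFAR-10-sized input (shapes are illustrative).
block = ThresholdDenseBlock(in_channels=16, growth_rate=12,
                            num_layers=6, threshold=3, keep_last=2)
out = block(torch.randn(1, 16, 32, 32))
print(out.shape)  # torch.Size([1, 88, 32, 32]): 16 input channels + 6 layers * 12 growth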
Comment: IEEE Access
Document Type: Working Paper
DOI: 10.1109/ACCESS.2022.3196492
Open Access: http://arxiv.org/abs/2201.03013
Accession Number: edsarx.2201.03013
Database: arXiv