Academic Journal

DARE: Diver Action Recognition Encoder for Underwater Human–Robot Interaction

Bibliographic Details
Title: DARE: Diver Action Recognition Encoder for Underwater Human–Robot Interaction
Authors: Jing Yang, James P. Wilson, Shalabh Gupta
Source: IEEE Access, Vol 11, pp 76926-76940 (2023)
Publication Information: IEEE, 2023.
Publication Year: 2023
Collection: LCC:Electrical engineering. Electronics. Nuclear engineering
Subject Terms: Autonomous underwater vehicles, diver action recognition, human-robot interaction, bi-channel convolutional neural networks, transfer learning, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
Description: With the growth of sensing, control, and robotic technologies, autonomous underwater vehicles (AUVs) have become useful assistants to human divers for performing various underwater operations. In current practice, divers are required to carry expensive, bulky, waterproof keyboards or joystick-based controllers for the supervision and control of AUVs. Therefore, diver action-based supervision is becoming increasingly popular because it is convenient, easier to use, faster, and cost-effective. However, various environmental, diver, and sensing uncertainties make the underwater diver action recognition problem challenging. In this regard, this paper presents DARE, a diver action recognition encoder, which is robust to underwater uncertainties and classifies various diver actions, including sixteen gestures and three poses, with high accuracy. DARE is based on the fusion of stereo-pairs of underwater camera images using bi-channel convolutional layers for feature extraction, followed by a systematically designed decision tree of neural network classifiers. DARE is trained using the Cognitive Autonomous Diving Buddy (CADDY) dataset, which consists of a rich set of images of different diver actions in real underwater environments. DARE requires only a few milliseconds to classify one stereo-pair, thus making it suitable for real-time implementation. The results show that DARE achieves up to 95.87% overall accuracy and 92% minimum class accuracy, thus verifying its robustness and reliability. Furthermore, a comparative evaluation against existing deep transfer learning architectures reveals that DARE improves the performance of baseline classifiers by up to 3.44% in overall accuracy and 30% in minimum class accuracy.
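The abstract describes fusing a stereo-pair of camera images through bi-channel convolutional layers before classification. The following is a minimal NumPy sketch of that fusion idea, not the authors' implementation: the left and right views are stacked as two input channels, and a single convolution kernel spanning both channels produces one fused feature map. All shapes, sizes, and weights here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stereo pair: two small grayscale views (left/right), 8x8 for brevity.
left = rng.random((8, 8))
right = rng.random((8, 8))

# Bi-channel input: stack the stereo pair along a channel axis,
# analogous to how bi-channel convolutional layers fuse the two camera views.
x = np.stack([left, right])            # shape (2, 8, 8)

# One illustrative 3x3 kernel spanning both input channels (weights arbitrary).
kernel = rng.standard_normal((2, 3, 3))

def conv2d_valid(x, k):
    """Naive 'valid' 2D cross-correlation, summed over input channels."""
    c, h, w = x.shape
    kc, kh, kw = k.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            # Element-wise product over both channels collapses the
            # stereo pair into a single fused response at (i, j).
            out[i, j] = np.sum(x[:, i:i + kh, j:j + kw] * k)
    return out

feat = conv2d_valid(x, kernel)         # fused feature map, shape (6, 6)
print(feat.shape)
```

In a full network this fused feature map would pass through further layers and, per the paper's design, a decision tree of neural network classifiers; here only the channel-fusion step is sketched.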
Document Type: article
File Description: electronic resource
Language: English
ISSN: 2169-3536
Relation: https://ieeexplore.ieee.org/document/10190624/; https://doaj.org/toc/2169-3536
DOI: 10.1109/ACCESS.2023.3298304
Open Access: https://doaj.org/article/e7b3c111dbfe4333b7bbf0ace72a5909
Accession Number: edsdoj.7b3c111dbfe4333b7bbf0ace72a5909
Database: Directory of Open Access Journals