UAV With the Ability to Control with Sign Language and Hand by Image Processing

Automatic recognition of sign language from hand gesture images is crucial for enhancing human-robot interaction, especially in critical scenarios such as rescue operations. In this study, we employed a DJI TELLO drone equipped with advanced machine vision capabilities to recognize and classify sign language gestures accurately.

Bibliographic Details
Main Authors: Hediyeh Hojaji, Alireza Delisnav, Mohammad Hossein Ghafouri Moghaddam, Fariba Ghorbani, Shadi Shafaghi, Masoud Shafaghi
Format: Article
Language: English
Published: Andalas University 2024-09-01
Series:JITCE (Journal of Information Technology and Computer Engineering)
Subjects:
Online Access:http://10.250.30.20/index.php/JITCE/article/view/246
_version_ 1823864574275747840
author Hediyeh Hojaji
Alireza Delisnav
Mohammad Hossein Ghafouri Moghaddam
Fariba Ghorbani
Shadi Shafaghi
Masoud Shafaghi
author_sort Hediyeh Hojaji
collection DOAJ
description Automatic recognition of sign language from hand gesture images is crucial for enhancing human-robot interaction, especially in critical scenarios such as rescue operations. In this study, we employed a DJI TELLO drone equipped with advanced machine vision capabilities to recognize and classify sign language gestures accurately. We developed an experimental setup in which the drone, integrated with state-of-the-art radio control systems and machine vision techniques, navigated simulated disaster environments to interact with human subjects using sign language. Data collection involved capturing a wide range of hand gestures under diverse environmental conditions to train and validate our recognition algorithms, implemented with YOLOv5 and Python libraries including OpenCV. This setup enabled precise hand and body detection, allowing the drone to navigate and interact effectively. We assessed the system's performance by its ability to recognize gestures accurately against both controlled and complex, cluttered backgrounds. Additionally, we developed robust, debris- and damage-resistant shielding mechanisms to safeguard the drone's integrity. Our drone fleet also established a resilient Wi-Fi communication network, ensuring uninterrupted data transmission even under connectivity disruptions. These findings underscore the potential of AI-driven drones to engage in natural conversational interactions with humans, providing vital information to support decision-making during emergencies. In conclusion, our approach promises to revolutionize the efficacy of rescue operations by facilitating rapid and accurate communication of critical information to rescue teams.
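The abstract describes translating per-frame gesture detections into drone commands. The record gives no implementation details, so the following is a minimal illustrative sketch only: hypothetical gesture class names (e.g. "thumb_up") are smoothed by majority vote over a short window of frames before being mapped to Tello-SDK-style command strings, so that a single misdetection against a cluttered background does not trigger a maneuver. The class names and the gesture-to-command mapping are assumptions, not the authors' actual code.

```python
from collections import Counter, deque

# Hypothetical mapping from gesture labels (as a detector such as YOLOv5
# might emit them) to Tello-SDK-style command strings; assumed, not from
# the paper.
GESTURE_TO_COMMAND = {
    "palm_open": "land",
    "thumb_up": "up 30",
    "thumb_down": "down 30",
    "point_left": "left 30",
    "point_right": "right 30",
}


class GestureController:
    """Smooth noisy per-frame detections with a majority vote."""

    def __init__(self, window=5):
        # Keep only the most recent `window` labels.
        self.history = deque(maxlen=window)

    def update(self, label):
        """Feed one per-frame detection; return a command string only when
        one label wins a strict majority of the window, else None."""
        self.history.append(label)
        top_label, count = Counter(self.history).most_common(1)[0]
        if count > self.history.maxlen // 2:
            return GESTURE_TO_COMMAND.get(top_label)
        return None


ctrl = GestureController(window=5)
cmd = None
for frame_label in ["thumb_up", "noise", "thumb_up", "thumb_up", "thumb_up"]:
    cmd = ctrl.update(frame_label)
print(cmd)  # -> up 30 ("thumb_up" holds 4 of the last 5 frames)
```

In a real pipeline, `update` would be called once per camera frame with the detector's top-scoring class, and the returned string sent to the drone (for example via a library such as djitellopy); the majority-vote window trades a few frames of latency for robustness to spurious detections.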
format Article
id doaj-art-b4a22ef305044fc9b6a1d94f14cf0c32
institution Kabale University
issn 2599-1663
language English
publishDate 2024-09-01
publisher Andalas University
record_format Article
series JITCE (Journal of Information Technology and Computer Engineering)
spelling doaj-art-b4a22ef305044fc9b6a1d94f14cf0c32 2025-02-08T21:25:57Z
eng; Andalas University; JITCE (Journal of Information Technology and Computer Engineering); ISSN 2599-1663; 2024-09-01; Vol. 8, No. 2; DOI 10.25077/jitce.8.2.49-57.2024
UAV With the Ability to Control with Sign Language and Hand by Image Processing
Hediyeh Hojaji: Research Innovation Teams, Tehran
Alireza Delisnav: Technical Specialist, Research Innovation Teams, Tehran
Mohammad Hossein Ghafouri Moghaddam: Technical Specialist, Research Innovation Teams, Tehran
Fariba Ghorbani: Tracheal Diseases Research Center, National Research Institute of Tuberculosis and Lung Diseases, Shahid Beheshti University of Medical Sciences, Tehran
Shadi Shafaghi: Lung Transplantation Research Center, National Research Institute of Tuberculosis and Lung Diseases, Shahid Beheshti University of Medical Sciences, Tehran
Masoud Shafaghi: Strategic Planning and Executive Office Manager, International Federation of Inventors' Associations, Geneva
http://10.250.30.20/index.php/JITCE/article/view/246
title UAV With the Ability to Control with Sign Language and Hand by Image Processing
topic Hand gesture recognition
Artificial intelligence
Convolutional neural network
Machine vision
Image processing AI
url http://10.250.30.20/index.php/JITCE/article/view/246
work_keys_str_mv AT hediyehhojaji uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing
AT alirezadelisnav uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing
AT mohammadhosseinghafourimoghaddam uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing
AT faribaghorbani uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing
AT shadishafaghi uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing
AT masoudshafaghi uavwiththeabilitytocontrolwithsignlanguageandhandbyimageprocessing