A Smart Sign Language Interpreter for Medical Environments Using Deep Learning: Morocco Case Study
Is part of
2024 2nd International Conference on Mechatronics, Control and Robotics (ICMCR), 2024, p.21-26
Place / Publisher
IEEE
Year of publication
2024
Source
IEEE Xplore
Descriptions/Notes
This paper presents a smart system designed to improve communication between individuals who rely primarily on sign language and those who are not proficient in it. Innovative gesture-recognition solutions can effectively bridge the communication gap between hearing individuals and people who are deaf, deafened, hard of hearing, or non-verbal. The proposed system combines surface electromyography (EMG) and Inertial Measurement Unit (IMU) sensors with Convolutional Neural Networks (CNN) to accurately interpret Moroccan Sign Language (MSL) and convert it into spoken language. The overarching goal of this sign language interpretation system is to contribute to the social integration of individuals with disabilities in medical and hospital environments.
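The abstract describes fusing surface-EMG and IMU sensor streams and feeding them to a CNN for gesture classification. The sketch below illustrates that general pipeline shape only: the two modalities are stacked into one multichannel time series (early fusion), passed through a 1-D convolution with ReLU, globally pooled, and mapped to per-gesture scores. All channel counts, window lengths, layer sizes, and weights here are illustrative assumptions, not the paper's actual architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed sensor layout: 8 EMG channels, 6 IMU channels (3-axis accel + gyro),
# 200-sample windows, 10 candidate MSL gesture classes. All hypothetical.
N_EMG, N_IMU, WINDOW = 8, 6, 200
N_CLASSES = 10

def conv1d(x, w, b):
    """Valid 1-D convolution: x (C_in, T), w (C_out, C_in, K) -> (C_out, T-K+1)."""
    c_out, c_in, k = w.shape
    t_out = x.shape[1] - k + 1
    out = np.empty((c_out, t_out))
    for t in range(t_out):
        out[:, t] = np.tensordot(w, x[:, t:t + k], axes=([1, 2], [0, 1])) + b
    return out

def classify(emg, imu, w, b, w_fc, b_fc):
    x = np.concatenate([emg, imu], axis=0)   # early fusion of both modalities
    h = np.maximum(conv1d(x, w, b), 0.0)     # ReLU feature maps
    feat = h.mean(axis=1)                    # global average pooling over time
    return feat @ w_fc + b_fc                # one score per gesture class

# Random weights and one synthetic sensor window, just to show the data flow.
w = rng.standard_normal((16, N_EMG + N_IMU, 5)) * 0.1
b = np.zeros(16)
w_fc = rng.standard_normal((16, N_CLASSES)) * 0.1
b_fc = np.zeros(N_CLASSES)

scores = classify(rng.standard_normal((N_EMG, WINDOW)),
                  rng.standard_normal((N_IMU, WINDOW)),
                  w, b, w_fc, b_fc)
print(scores.shape)  # one score vector over the candidate gesture classes
```

In a real system the scores would be trained with labeled MSL gesture windows and the predicted class fed to a text-to-speech stage to produce spoken output, as the abstract outlines.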