Assistive Device for the Translation from Mexican Sign Language to Verbal Language
Abstract
In this work, we present the design and implementation of an assistive device that translates words from Mexican Sign Language into verbal language for people with hearing disabilities. The device consists of a wearable embedded computer with a camera and a pair of gloves. The system captures hand-gesture images, extracts features from the gloves, and classifies them with an Artificial Neural Network. Our system achieves an average precision of 88% and an average recall of 90% on a vocabulary of 20 signs.
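The classification stage described above can be sketched as a small feed-forward network that maps glove-derived features to one of the 20 signs. The following is a minimal, hypothetical sketch: the feature layout (fingertip marker positions), layer sizes, and random weights are illustrative assumptions, not the paper's actual trained model.

```python
import numpy as np

# Illustrative dimensions (assumptions, not the paper's values):
N_FEATURES = 10   # e.g., (x, y) image positions of 5 colored glove markers
N_HIDDEN = 16     # hidden-layer width
N_SIGNS = 20      # the 20 signs evaluated in the paper

rng = np.random.default_rng(0)

# Randomly initialized weights stand in for a trained model.
W1 = rng.normal(0.0, 0.1, (N_FEATURES, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(0.0, 0.1, (N_HIDDEN, N_SIGNS))
b2 = np.zeros(N_SIGNS)

def classify(features: np.ndarray) -> int:
    """Forward pass: tanh hidden layer, softmax output, argmax class."""
    h = np.tanh(features @ W1 + b1)
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())  # numerically stable softmax
    probs /= probs.sum()
    return int(np.argmax(probs))

# Classify a synthetic feature vector standing in for real glove features.
sample = rng.normal(0.0, 1.0, N_FEATURES)
sign_id = classify(sample)
```

In a deployed system the weights would come from training on labeled glove-feature vectors, and `sign_id` would index the sign whose verbal translation is spoken by the device.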
Keywords
Assistive devices, Mexican Sign Language, computer vision