Researchers have been working on the recognition of sign language gestures. The development of pattern recognition techniques can assist communication with sign language users, and systems based on wearable sensors can also be applied in other domains, such as virtual reality and games. This work employs classifiers based on Recurrent Neural Networks (RNNs) to recognize Brazilian Sign Language (Libras) gestures. Unlike traditional classifiers, classification with RNNs can be performed without a manual feature extraction step. An instrumented glove composed of flex, inertial, and contact sensors was used to acquire data and build a database for evaluating the RNN models. Data from ten Libras gestures were collected from ten volunteers, and Long Short-Term Memory (LSTM) and Gated Recurrent Unit (GRU) classifiers were implemented for gesture recognition. The best accuracy, 98.25%, was obtained with the GRU classifier, while the best mean accuracy obtained with the LSTM model was 98%. The results show that RNNs can recognize sign language gestures acquired with an instrumented glove with good performance, and a statistical test showed that the results obtained with the LSTM and GRU models have similar distributions.
Classification of Brazilian Sign Language Gestures Based on Recurrent Neural Network Models, with Instrumented Glove
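For illustration, the sketch below shows one way such an RNN classifier could be set up. PyTorch, the channel count (16), the hidden size (64), and the sequence length are assumptions made for the example and are not taken from the paper; the ten output classes correspond to the ten Libras gestures described above, and raw glove sensor sequences are fed to the network without a manual feature extraction step.

```python
import torch
import torch.nn as nn

class GloveGestureRNN(nn.Module):
    """RNN classifier (LSTM or GRU backbone) for instrumented-glove sensor sequences."""

    def __init__(self, n_channels: int = 16, hidden_size: int = 64,
                 n_classes: int = 10, cell: str = "gru"):
        super().__init__()
        rnn_cls = nn.GRU if cell == "gru" else nn.LSTM
        # batch_first=True -> inputs of shape (batch, time_steps, n_channels)
        self.rnn = rnn_cls(n_channels, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x holds raw flex/inertial/contact readings over time;
        # no hand-crafted features are computed beforehand.
        out, _ = self.rnn(x)
        # Classify from the hidden state at the final time step.
        return self.head(out[:, -1, :])

# Example: a batch of 8 recordings, 100 time steps, 16 sensor channels,
# mapped to logits over 10 Libras gestures.
model = GloveGestureRNN(cell="gru")
logits = model(torch.randn(8, 100, 16))
print(logits.shape)  # torch.Size([8, 10])
```

Switching the `cell` argument between `"gru"` and `"lstm"` swaps the recurrent backbone while keeping the rest of the pipeline fixed, mirroring the comparison between the two models reported above.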