
Sign Language Detection Using Deep Learning

EasyChair Preprint no. 9775

4 pages
Date: February 24, 2023


This paper deals with automatic, deep-learning-based sign language detection. To detect sign language, we design a real-time human-computer interaction system based on hand gestures captured by a camera, whereby the neural network recognizes the signed gesture from each captured image. A convolutional neural network (CNN) is used as the core of the system: each frame is evaluated separately, and the predictions over the last 20 frames, corresponding to approximately one second of video, are averaged in both the training and test datasets. We first analyze image segmentation methods and then develop a model based on convolutional neural networks. Using an annotated dataset of more than 2000 image slices, we train and test the segmentation network to extract the hand gesture from the images.
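The per-frame evaluation with averaging over the last 20 frames described above can be sketched as a sliding-window smoother over per-frame class probabilities. This is a minimal illustration, not the authors' implementation; the `smooth_predictions` helper and the toy softmax vectors are assumptions for demonstration, and any per-frame CNN classifier could supply the inputs.

```python
from collections import deque

import numpy as np

WINDOW = 20  # frames averaged; roughly one second of video per the abstract


def smooth_predictions(frame_probs, window=WINDOW):
    """Average per-frame class probabilities over a sliding window.

    frame_probs: iterable of 1-D softmax vectors, one per video frame
    (here produced by a hypothetical CNN classifier).
    Yields the running mean over the last `window` frames.
    """
    buf = deque(maxlen=window)
    for p in frame_probs:
        buf.append(np.asarray(p, dtype=float))
        yield np.mean(buf, axis=0)


# Toy demo: three gesture classes, constant per-frame prediction.
frames = [np.array([0.1, 0.8, 0.1])] * 25
smoothed = list(smooth_predictions(frames))
print(int(np.argmax(smoothed[-1])))  # class index after smoothing
```

Averaging suppresses single-frame misclassifications (e.g. motion blur during a transition between signs) at the cost of roughly one second of latency, which matches the window size chosen in the paper.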

Keyphrases: CNN, Hand Gestures, HCI, Sign Language Detection.

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:9775,
  author = {G. Sreenivasulu and Srilatha Kadari},
  title = {Sign Language Detection Using Deep Learning},
  howpublished = {EasyChair Preprint no. 9775},
  year = {EasyChair, 2023}}