
sign-language

Here are 213 public repositories matching this topic...

Implements normalized, polar, and delta feature sets; cross-validation folds; Bayesian Information Criterion (BIC) and Discriminative Information Criterion (DIC) model selectors; and a recognizer that detects and translates sign language into text using hidden Markov models, built as part of the Udacity Artificial Intelligence Nanodegree.

  • Updated Apr 23, 2017
  • HTML
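The BIC model selection described above can be sketched without depending on a specific HMM library. A minimal sketch, assuming a Gaussian HMM with diagonal covariances and precomputed per-state-count log-likelihoods (the `select_n_states` and `hmm_free_params` names are illustrative, not from the repository):

```python
import math

def bic(log_likelihood, n_params, n_obs):
    """Bayesian Information Criterion: lower is better."""
    return -2.0 * log_likelihood + n_params * math.log(n_obs)

def hmm_free_params(n_states, n_features):
    """Free parameters of a Gaussian HMM with diagonal covariances:
    transition probabilities, initial probabilities, means, variances."""
    return (n_states * (n_states - 1)   # transition matrix rows (sum to 1)
            + (n_states - 1)            # initial state distribution
            + 2 * n_states * n_features)  # means + diagonal covariances

def select_n_states(scores, n_features, n_obs):
    """scores maps candidate state counts to training log-likelihoods;
    returns the state count minimizing BIC."""
    return min(scores,
               key=lambda n: bic(scores[n],
                                 hmm_free_params(n, n_features),
                                 n_obs))
```

The penalty term is what keeps BIC from always preferring more states: extra states raise the log-likelihood, but each one adds parameters that cost `log(N)` apiece.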

We help deaf and mute people communicate with hearing people via hand-gesture-to-speech conversion. This code uses depth maps from the Kinect camera and techniques such as convex hull and contour mapping to recognise five hand signs.

  • Updated Jul 27, 2017
  • Python
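The convex-hull step of the pipeline above can be illustrated without OpenCV. A minimal sketch using Andrew's monotone-chain algorithm over 2D contour points (in the real pipeline, OpenCV's `cv2.findContours` and `cv2.convexHull` would supply the contour and hull; this standalone version is assumed for illustration only):

```python
def convex_hull(points):
    """Andrew's monotone chain: returns the convex hull of a set of
    2D (x, y) points as a list of vertices in counter-clockwise order."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts

    def cross(o, a, b):
        # z-component of (a - o) x (b - o); positive for a left turn
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    lower, upper = [], []
    for p in pts:                      # build lower hull left-to-right
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):            # build upper hull right-to-left
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    # drop duplicated endpoints where the two chains meet
    return lower[:-1] + upper[:-1]
```

In hand-sign recognition, the gaps between the hull and the hand contour (convexity defects) are what indicate spaces between extended fingers, so the hull is the first step toward counting fingers.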

This project investigates the use of various machine learning techniques with various gesture-recognition devices to recognise gestures from the South African Sign Language alphabet. Three devices are used, namely the Leap Motion Controller, Microsoft Kinect, and Myo. This project is subject to the intellectual copyright terms stipulate…

  • Updated Oct 3, 2017
  • Python

Context: Sign languages (also known as signed languages) are languages that use manual communication to convey meaning. This can include simultaneously employing hand gestures, movement, orientation of the fingers, arms or body, and facial expressions to convey a speaker's ideas. Source: https://en.wikipedia.org/wiki/Sign_language

  • Updated Jul 12, 2018
  • Jupyter Notebook
