Computer Vision Project: Detection and Classification of ASL hand signs

I completed this project to introduce myself to computer vision. I used OpenCV and CVZone's Hand Tracking Module to detect and capture hand signs from American Sign Language (ASL), then used Google's Teachable Machine to generate a machine learning model, which I apply to classify the hand signs in real time. My goal is to expand this project by using PyTorch to build and train a model with transfer learning, and to evaluate the model in greater depth.

This model is trained to classify letters A to G of American Sign Language.

Dependencies:

  • cvzone 1.6.1
  • mediapipe 0.9.0.1
  • TensorFlow 2.9.1

Relevant Files and Folders:

  • dataCollection.py: A script you can run to capture and save hand-sign images to the designated folder. Pressing s takes a snapshot of the detected hand.
  • test.py: A script you can run to classify your hand sign in real time using the machine learning model. It draws a pink rectangle around your hand and displays the letter the sign stands for above it.
  • Data/: Contains a subfolder for each letter A through G, each with ~300 images of me making that sign; these were used for training.
  • Model/keras_model.h5: This file contains the machine learning model from Teachable Machine. It is trained to classify letters A to G of American Sign Language.

Steps:

1) Detect Hand

(Screenshot: hand detected using CVZone's Hand Tracking Module)
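
For reference, here is a minimal sketch of this detection step with CVZone's HandDetector; the webcam index and parameters are assumptions and may differ from the actual scripts:

```python
import cv2
from cvzone.HandTrackingModule import HandDetector

cap = cv2.VideoCapture(0)                 # default webcam (assumed index)
detector = HandDetector(maxHands=1)       # track a single hand

while True:
    success, img = cap.read()
    if not success:
        break
    # findHands returns the detected hands and the image with landmarks drawn
    hands, img = detector.findHands(img)
    if hands:
        x, y, w, h = hands[0]["bbox"]     # bounding box around the detected hand
    cv2.imshow("Hand Detection", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```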

2) Crop Hand and Overlay onto Square

(Screenshot: hand cropped to a square region)

(Screenshot: background filled with white to standardize the dataset)

3) Save Images and Data Collection

Ran dataCollection.py and pressed s to capture around 300 images of each hand sign, from A through G.
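
A minimal sketch of the collection loop, assuming the folder path Data/A and reusing the hypothetical to_white_square helper from the Step 2 sketch:

```python
import time
import cv2
from cvzone.HandTrackingModule import HandDetector

folder = "Data/A"   # change per letter being collected (assumed path)
cap = cv2.VideoCapture(0)
detector = HandDetector(maxHands=1)
counter = 0

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)
    if hands:
        img_white = to_white_square(img, hands[0]["bbox"])  # helper from the Step 2 sketch
        cv2.imshow("ImageWhite", img_white)
    cv2.imshow("Image", img)

    key = cv2.waitKey(1) & 0xFF
    if key == ord("s") and hands:   # press s to save a snapshot of the current sign
        counter += 1
        cv2.imwrite(f"{folder}/Image_{time.time()}.jpg", img_white)
        print(f"Saved image {counter}")
    elif key == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```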

4) Training Model

(Screenshot: training the model in Google's Teachable Machine)

The ~300 images collected for each letter were uploaded to Google's Teachable Machine, which trained the classifier and exported it as Model/keras_model.h5.

5) Apply Model

(Screenshots: real-time classification of hand signs A through G)
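
A minimal sketch of how test.py might apply the model with CVZone's Classifier; the Model/labels.txt path is assumed (Teachable Machine exports a labels file alongside keras_model.h5), and to_white_square is again the hypothetical helper from the Step 2 sketch:

```python
import cv2
from cvzone.HandTrackingModule import HandDetector
from cvzone.ClassificationModule import Classifier

labels = ["A", "B", "C", "D", "E", "F", "G"]

cap = cv2.VideoCapture(0)
detector = HandDetector(maxHands=1)
classifier = Classifier("Model/keras_model.h5", "Model/labels.txt")  # labels path assumed

while True:
    success, img = cap.read()
    if not success:
        break
    hands, img = detector.findHands(img)
    if hands:
        x, y, w, h = hands[0]["bbox"]
        img_white = to_white_square(img, hands[0]["bbox"])
        # getPrediction returns the class probabilities and the index of the best class
        prediction, index = classifier.getPrediction(img_white, draw=False)
        # pink rectangle around the hand with the predicted letter above it
        cv2.rectangle(img, (x - 20, y - 20), (x + w + 20, y + h + 20), (255, 0, 255), 4)
        cv2.putText(img, labels[index], (x, y - 30),
                    cv2.FONT_HERSHEY_SIMPLEX, 2, (255, 0, 255), 2)
    cv2.imshow("Classification", img)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```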

Overall, the model achieved 79% accuracy on this classification problem.
