
Hand-gesture-controled-Arduino-car-using-machine-learning-model

Overview

In some situations we only have a limited amount of data, but we still want to train a robust machine learning model ourselves. In such cases it is a very good idea to start from a pre-trained model created by professionals. These models have well-designed architectures and contain a large number of pre-trained parameters, which helps when your own dataset is very small. You can easily load these models from Keras; VGG16 is a widely used example.
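As a sketch of what this looks like in code (assuming TensorFlow's bundled Keras; the classifier-head layer sizes here are illustrative, not taken from this repository):

```python
# Minimal sketch: reuse VGG16's convolutional base and train only a small
# classifier head on top for the four gesture classes.
from tensorflow.keras.applications import VGG16
from tensorflow.keras import layers, models

# Pass weights="imagenet" instead of None to download the pre-trained
# ImageNet parameters; None keeps this example offline-friendly.
base = VGG16(weights=None, include_top=False, input_shape=(224, 224, 3))
base.trainable = False  # freeze the pre-trained convolutional layers

model = models.Sequential([
    base,
    layers.Flatten(),
    layers.Dense(256, activation="relu"),
    layers.Dense(4, activation="softmax"),  # stop / forward / left / right
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

Freezing the base means only the small dense head is trained, which is what makes a tiny dataset workable.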

The basic structure of VGG16


Four main files

1. 'taking picture' creates your own dataset by capturing many pictures for each class.

2. 'train_own_model_from_vgg16' trains your customized model by loading VGG16 from Keras and training it on your own dataset.

3. 'tkinter with opencv with keras model' creates a user interface with embedded hand-gesture recognition using the model you trained. Recognition starts after you click the 'Start' button. If you don't want to train a model yourself, you can run this file straight away with my model.

4. 'arduino_car' is an Arduino sketch that must be uploaded to the Arduino board; it handles Bluetooth communication and motor control.
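On the Python side, each prediction has to reach the Arduino over the Bluetooth serial link. A minimal sketch, assuming a protocol where each class index is sent as a single ASCII character (the repository's actual protocol may differ; the port object can be anything with a `write` method, such as a pyserial `Serial` instance):

```python
# Map each predicted gesture class to a one-byte command for the car.
# The single-character '0'-'3' protocol is an assumption for illustration.
COMMANDS = {0: b"0", 1: b"1", 2: b"2", 3: b"3"}  # stop/forward/left/right

def send_command(port, class_index):
    """Write the one-byte command for class_index to an open serial port."""
    port.write(COMMANDS[class_index])

# Usage with pyserial (the port name is an assumption):
#   import serial
#   with serial.Serial("/dev/rfcomm0", 9600, timeout=1) as port:
#       send_command(port, 1)  # run forward
```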

Four classes

stop '0'


run forward '1'


turn left '2'


turn right '3'


tkinter GUI and test result

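The GUI piece can be sketched roughly as follows (the widget layout and the `predict_frame` callback are illustrative assumptions, not the repository's actual code; OpenCV frame grabbing is omitted):

```python
# Minimal sketch of the tkinter front end: a Start button that kicks off a
# periodic classify-and-display loop via root.after().
import tkinter as tk

LABELS = {0: "stop", 1: "run forward", 2: "turn left", 3: "turn right"}

def label_for(class_index):
    """Human-readable label for a predicted class index."""
    return LABELS.get(class_index, "unknown")

def build_gui(predict_frame):
    """predict_frame() stands in for grab-a-frame-and-classify."""
    root = tk.Tk()
    root.title("Hand gesture recognition")
    result = tk.Label(root, text="press Start")
    result.pack()

    def loop():
        result.config(text=label_for(predict_frame()))
        root.after(100, loop)  # re-run roughly every 100 ms

    tk.Button(root, text="Start", command=loop).pack()
    return root

# Usage (needs a display and a real classifier):
#   build_gui(lambda: 0).mainloop()
```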