This repository has been archived by the owner on Jun 2, 2023. It is now read-only.

SecureAIAutonomyLab/neuro-face-attention


Neuro-Face-Attention

TensorFlow implementation of Neuro-Face-Attention by Arun Das, Henry Chacon, and Paul Rad.

We tackle the problem of learning low-level facial attributes to encode muscle movements as a dense vector for stuttering studies.

Setup

The code was developed and tested on Python 3.6. Required packages are listed in the requirements.txt file.

Dataset

Raw videos may be passed as a parameter to the dataset preprocessing scripts. However, the current data pipeline expects facial Action Units (AUs) as input.
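The exact AU file format is not documented here; as an illustrative sketch only, assuming per-frame AU intensities exported as CSV with OpenFace-style column names (AU01_r, AU02_r, ...), each frame can be encoded as a dense feature vector:

```python
import csv
import io

def frames_to_vectors(csv_text, au_prefix="AU"):
    """Parse per-frame facial Action Unit intensities from CSV text into
    one dense feature vector per frame.

    The column naming (AU01_r, AU02_r, ...) follows OpenFace conventions
    and is an assumption, not something this repository specifies.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    au_cols, vectors = None, []
    for row in reader:
        if au_cols is None:
            # Fix a stable, sorted ordering of the AU columns.
            au_cols = sorted(c for c in row if c.startswith(au_prefix))
        vectors.append([float(row[c]) for c in au_cols])
    return au_cols, vectors
```

Non-AU columns (frame index, timestamps, confidence scores) are simply ignored, so the same parser works on richer per-frame exports.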

The dataset must follow the folder structure below for the data processing scripts to work:

Dataset Root
    Subject 1
        Study 1
            Paradigm 1
            Paradigm 2
        Study 2
            Paradigm 1
            Paradigm 2
        ...
        Study n
            Paradigm 1
            Paradigm 2
    Subject 2
        Study 1
            Paradigm 1
            Paradigm 2
        Study 2
            Paradigm 1
            Paradigm 2
        ...
        Study n
            Paradigm 1
            Paradigm 2
    ...
    Subject n
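To make the layout concrete, here is a minimal sketch (the function name and return shape are illustrative, not taken from the repository) that enumerates every (subject, study, paradigm) directory under the dataset root:

```python
from pathlib import Path

def index_dataset(root):
    """Yield (subject, study, paradigm) name triples from the
    Dataset Root / Subject / Study / Paradigm layout described above."""
    for subject in sorted(p for p in Path(root).iterdir() if p.is_dir()):
        for study in sorted(p for p in subject.iterdir() if p.is_dir()):
            for paradigm in sorted(p for p in study.iterdir() if p.is_dir()):
                yield subject.name, study.name, paradigm.name
```

Each subject may have a different number of studies; the nested iteration handles that naturally, and sorting keeps the traversal order deterministic across runs.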

Usage

There are several Jupyter notebooks in the src folder. Use the data pipeline notebook to pre-process your data. The model pipeline notebooks contain the deep learning architectures; use them to train the models. Each trained model is saved to its specified directory.
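As a hedged sketch of the kind of pre-processing the data pipeline performs (window length and stride here are illustrative parameters, not values taken from the notebooks), per-frame AU vectors are typically cut into fixed-length overlapping sequences before being fed to a sequence model:

```python
def sliding_windows(frames, window=30, stride=15):
    """Split a list of per-frame feature vectors into overlapping
    fixed-length windows suitable for training a sequence model."""
    return [frames[i:i + window]
            for i in range(0, len(frames) - window + 1, stride)]
```

A 50% overlap (stride = window / 2) is a common default: it multiplies the number of training sequences without discarding any frames at window boundaries.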

Dependencies

All Python dependencies are listed in requirements.txt and can be installed with pip install -r requirements.txt.
