
Toward Fine-grained Facial Expression Manipulation (ECCV 2020, Paper)

Python 3.6 | PyTorch 0.4.1 | PyTorch 1.3.1

Arbitrary Facial Expression Manipulation. Our model can 1) perform continuous editing between two expressions (top); 2) learn to modify only one facial component (middle); 3) transform expressions in paintings (bottom). From left to right, the emotion intensity is set to 0, 0.5, 0.75, 1, and 1.25.

Single/Multiple AU Editing

Single/multiple AU editing. AU4: Brow Lowerer; AU5: Upper Lid Raiser; AU7: Lid Tightener; AU12: Lip Corner Puller; AU15: Lip Corner Depressor; AU20: Lip Stretcher. The legend below the images shows the relative AU intensities. A higher (lower) AU value strengthens (weakens) the corresponding facial action unit in the input image.
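To illustrate what a relative AU intensity is, here is a minimal sketch: the edit is driven by the difference between a target AU vector and the AU vector of the input image. The AU ordering and all values below are hypothetical, chosen only for illustration.

```python
import numpy as np

# Hypothetical AU intensity vectors (order: AU4, AU5, AU7, AU12, AU15, AU20),
# e.g., as extracted by a tool such as OpenFace. Values are illustrative only.
source_aus = np.array([0.2, 0.1, 0.3, 1.5, 0.0, 0.4])  # AUs of the input image
target_aus = np.array([1.0, 0.1, 0.3, 0.0, 0.8, 0.4])  # AUs of the desired expression

# Relative AUs: positive entries strengthen an action unit, negative entries weaken it.
relative_aus = target_aus - source_aus
# Here AU4 (Brow Lowerer) is strengthened and AU12 (Lip Corner Puller) is weakened.
```

Zero entries leave the corresponding facial component untouched, which is what enables editing a single AU in isolation.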

Expression Transfer

Arbitrary Facial Expression Manipulation. The top-left image with the blue box is the input; the images in odd rows show the target expressions, and the images in even rows are the animated results.


Resources

Here are some resources to help you learn more about Action Units.

Prerequisites

  • Install PyTorch (version 0.4.1 or >= 1.3.0) and torchvision

  • Install the dependencies listed in requirements.txt:

    numpy
    matplotlib
    tqdm
    pickle
    opencv-python
    tensorboardX
    face_alignment

Getting Started

1. Data Preparation

  • Prepare your images (e.g., EmotionNet, AffectNet, etc.)

  • Extract the Action Units with OpenFace, and generate aus_dataset.pkl, which contains a list of dicts, e.g., [{'file_path': <path of image1>, 'aus': <extracted aus of image1>}, {'file_path': <path of image2>, 'aus': <extracted aus of image2>}]

  • Please refer to src/samples/aus_dataset.pkl

  • You can use pickle to save the .pkl file:

    import pickle

    with open('aus_dataset.pkl', 'wb') as f:
        pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)
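Putting the steps above together, here is a minimal self-contained sketch of building the list of dicts and round-tripping it through pickle. The file paths and AU values are placeholders, not real dataset entries; in practice 'aus' holds the intensities extracted by OpenFace.

```python
import pickle

# Placeholder entries; replace with real image paths and OpenFace AU intensities.
data = [
    {'file_path': 'images/img1.jpg', 'aus': [0.5, 1.2, 0.0]},
    {'file_path': 'images/img2.jpg', 'aus': [0.0, 0.3, 2.1]},
]

# Save with the highest protocol for compact, fast serialization.
with open('aus_dataset.pkl', 'wb') as f:
    pickle.dump(data, f, pickle.HIGHEST_PROTOCOL)

# Reload to verify the round trip before training.
with open('aus_dataset.pkl', 'rb') as f:
    loaded = pickle.load(f)
```

Compare the sample file src/samples/aus_dataset.pkl to confirm your structure matches what the loader expects.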

2. Training

To train, please modify the parameters in launch/train.sh and run:

bash launch/train.sh

Citation

If you find this repository helpful, use this code, or adopt ideas from the paper in your research, please cite:

@inproceedings{ling2020toward,
  title={Toward Fine-grained Facial Expression Manipulation},
  author={Ling, Jun and Xue, Han and Song, Li and Yang, Shuhui and Xie, Rong and Gu, Xiao},
  booktitle={European Conference on Computer Vision},
  pages={37--53},
  year={2020},
  organization={Springer}
}

Contact

Please contact lingjun@sjtu.edu.cn or open an issue for any questions or suggestions.

Acknowledgement
