
Liveness detection

A strong baseline for liveness detection. The challenge is part of the ZaloAI Challenge Series, a series of challenges organized by ZaloAI to promote AI research in Vietnam. The source code can be used for similar tasks, such as face anti-spoofing or detecting fake videos.

Problem statement

In verification services related to face recognition (such as eKYC and face access control), the key question is whether the input face video is real (from a live person present at the point of capture) or fake (from a spoof artifact or a lifeless body). Liveness detection is the AI task of answering that question.

In this challenge, participants build a liveness detection model to classify whether a given facial video is real or spoofed.

  • Input: a selfie/portrait face video of 1-5 seconds (you can use any frames you like).

  • Output: a liveness score in [0, 1] (0 = fake, 1 = real).

Example Output: Predict.csv

fname          liveness_score
VideoID.mp4    0.10372
...            ...
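
For illustration, the following Python sketch writes predictions in this format. It is not the repository's code; the file names and scores below are hypothetical placeholders.

# Minimal sketch (not the repo's code) of writing a submission file in the
# required format: one row per video with its predicted liveness score.
import csv

# Hypothetical scores; in practice these come from your trained model.
scores = {"VideoID.mp4": 0.10372, "OtherVideo.mp4": 0.98123}

with open("predict.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["fname", "liveness_score"])  # header expected by the challenge
    for fname, score in scores.items():
        writer.writerow([fname, f"{score:.5f}"])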

Features

Currently, the following features are supported:

  • Training and evaluation code for liveness detection (frame-level and face-level classification).
  • Support for training on multiple GPUs.
  • Automatic mixed precision training.
  • Automatic learning-rate finding.
  • Support for backbones such as EfficientNet and ViT.
  • Manage experiments with Weights & Biases.
  • Code management with registry, config, and logging.
  • Dockerfile for deployment.
  • Unit tests.
  • Packaging available.
  • Support for semi-supervised learning on external unlabeled data.

More features will be added in the future.

If you have any suggestions, please feel free to open an issue or pull request. If you want to contribute to this project, please read the contribution guide. This project is part of my research template, a collection of research projects and tools designed to help researchers build their own projects quickly and achieve strong performance with minimal effort. The template is still in an early stage of development, so I really appreciate any feedback and contributions.

Environment

For the required packages, please refer to environment.yml. You can create a conda environment with the following commands:

conda env create -f environment.yml 
conda activate zaloai

Alternatively, you can use the Docker image we provide. Please refer to the Dockerfile for more details.

Data preparation

Read the data preparation guide for more details.

Training, evaluation, and inference

The provided scripts are in the scripts folder. All scripts share the same interface, which requires the following arguments:

  • -c or --config: path to the config file
  • -o or --opt: additional options to override the config file (e.g. --opt extractor.name=efficientnet)

For example, check out the provided config files in the configs folder and the training instructions in the train.ipynb notebook. Evaluation and inference are covered in the predict.ipynb notebook.
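
For illustration only, the sketch below shows how a --config file plus dotted --opt overrides typically behave. It assumes YAML config files and PyYAML; it is not the repository's actual parsing code.

# Rough sketch (assuming YAML configs) of loading a config file and applying
# dotted key=value overrides such as extractor.name=efficientnet.
import argparse
import yaml

parser = argparse.ArgumentParser()
parser.add_argument("-c", "--config", required=True, help="path to the config file")
parser.add_argument("-o", "--opt", nargs="*", default=[], help="key=value overrides")
args = parser.parse_args()

with open(args.config) as f:
    cfg = yaml.safe_load(f)

for item in args.opt:
    key, value = item.split("=", 1)
    *parents, leaf = key.split(".")
    node = cfg
    for part in parents:           # walk nested dicts, e.g. cfg["extractor"]
        node = node.setdefault(part, {})
    node[leaf] = value             # e.g. cfg["extractor"]["name"] = "efficientnet"

print(cfg)

With such an interface, a typical call looks like python scripts/train.py -c configs/<config>.yaml -o extractor.name=efficientnet, where the script and config names are placeholders; see the configs folder and train.ipynb for the real ones.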

Docker

For deployment or training, Docker is a ready-to-use solution. We provide a prebuilt Docker image on Docker Hub. You can pull the image with the following command:

docker pull nhtlongcs/liveness-detection

To start a Docker container in interactive mode:

# device selects the GPU(s) to expose to the container; shm-size is the shared
# memory size and should be large enough for the model and data loader workers
$ docker run --rm --name liveness-detection --gpus '"device=0,1"' --shm-size 16G -it -v $(pwd)/:/home/workspace/src/ nhtlongcs/liveness-detection:infection /bin/bash

We also provide instructions for building the Docker image, along with other useful commands, in docker.md.

Acknowledgement

The base solution is inspired by this discussion.
