intellerce/controlanimate

ControlAnimate

  • Combines AnimateDiff with Multi-ControlNet and Img2Img for Vid2Vid applications. This small library focuses on Vid2Vid: ControlNet (or Multi-ControlNet) guides the video generation, while AnimateDiff provides temporal consistency.
  • In addition, it uses Img2Img to create more consistent videos (after the first epoch). Like AnimateDiff, it supports DreamBooth/LoRA models in addition to the Stable Diffusion 1.5 base model.
  • This is an initial release so please expect potential issues and bugs. Feedback, suggestions, and feature requests are welcome.
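The overlapping-batch idea described above can be sketched as follows. This is a minimal illustration with a stubbed-out generation step: `generate_batch`, `vid2vid`, and all parameter names are hypothetical, not the library's actual API. Real code would run AnimateDiff with Multi-ControlNet guidance and seed Img2Img from the previous batch's tail frames.

```python
import numpy as np

def generate_batch(control_frames, init_frames=None):
    # Stub for the diffusion step: real code would run AnimateDiff with
    # Multi-ControlNet guidance, using init_frames for Img2Img when given.
    return [np.asarray(f, dtype=float) for f in control_frames]

def vid2vid(frames, batch_size=16, overlap=4):
    """Process the video in overlapping batches; after the first batch,
    the previous batch's tail frames seed Img2Img for consistency."""
    out, prev_tail = [], None
    for start in range(0, len(frames), batch_size - overlap):
        batch = frames[start:start + batch_size]
        if not batch:
            break
        result = generate_batch(batch, init_frames=prev_tail)
        prev_tail = result[-overlap:]
        # the first `overlap` frames of later batches repeat the previous
        # batch's tail, so emit them only once
        out.extend(result if start == 0 else result[overlap:])
    return out
```

With the identity stub, a 32-frame input yields 32 output frames in order, which shows that the overlap bookkeeping neither drops nor duplicates frames.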


News

  • 🔥 Nov. 20, 2023 - Now supporting IP-Adapter, xformers, and Color Matching!
  • 🔥 Nov. 12, 2023 - Now supporting LCM-LoRA & ControlNet for all combinations!
  • 🔥 Nov. 7, 2023 - Now supporting Latent Consistency Model (LCM) - Achieving 10X performance gain!

Supported Features

  • 💥 IP Adapter (Used for Increasing the Similarity of Batches of AnimateDiff Frames)
  • 💥 Latent Consistency Model LoRA (LCM-LoRA)
  • 💥 Latent Consistency Model (LCM) Native
  • 💥 Multi-ControlNet can be Combined with LCM, etc.
  • 💥 Prompt Weighting and Long Prompts (Compel)
  • 💥 DreamBooth & LoRA
  • 💥 FFMPEG Interpolation
  • 💥 Color Matching Between Batches for Improved Consistency
  • 💥 Latent Overlapping (Img2Img & ControlNet) & Frame Overlapping (Blending)
  • 💥 Face Enhancement and Upscaling (GFPGAN & RealESRGAN)
  • 💥 Arbitrary Frame Rate, Duration, and Resolution Sampling of the Input Video
  • 💥 xformers Enabled
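The color-matching feature above can be illustrated with a minimal channel-wise mean/variance transfer, one common color-transfer technique. This is only an approximation for illustration; the repository itself uses the Color Matcher package, and the function name here is hypothetical.

```python
import numpy as np

def match_colors(frame, reference):
    """Shift each channel of `frame` so its mean and standard deviation
    match those of `reference` (a frame from the previous batch)."""
    frame = frame.astype(float)
    reference = reference.astype(float)
    out = np.empty_like(frame)
    for c in range(frame.shape[-1]):
        f, r = frame[..., c], reference[..., c]
        scale = r.std() / (f.std() + 1e-8)          # match spread
        out[..., c] = (f - f.mean()) * scale + r.mean()  # match center
    return np.clip(out, 0, 255)
```

Applying this to the first frames of each new batch, with the last frame of the previous batch as the reference, damps the color drift that otherwise accumulates between independently generated batches.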

Compatibility and Requirements

  • This codebase has been tested on Linux (Ubuntu 22.04) only, on an Intel machine with an NVIDIA GeForce RTX 3090 (24 GB VRAM); it requires at least 16 GB of RAM.

Installation

  • Make sure Anaconda is installed (https://www.anaconda.com/download).
  • Also make sure that FFMPEG is properly installed and set up (follow "Guide 1" for installation and, if issues remain, "Guide 2"). You can set the FFMPEG path in the configs/prompts YAML files.
git clone git@github.com:intellerce/controlanimate.git
cd ControlAnimate

bash download.sh

conda env create -f env.yml

Vid2Vid

  • After editing the config file 'configs/prompts/SampleConfig.yaml' (don't forget to point it to a valid input video file), simply run:
conda activate controlanimate
bash start.sh
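As a rough sketch, such a config might look like the following. The field names here are illustrative assumptions only; consult the actual configs/prompts/SampleConfig.yaml shipped with the repository for the real keys.

```yaml
# Illustrative only -- check configs/prompts/SampleConfig.yaml for real keys
input_video: /path/to/input.mp4
prompt: "a watercolor painting of a dancer"
negative_prompt: "blurry, low quality"
width: 512
height: 512
fps: 12
ffmpeg_path: /usr/bin/ffmpeg
```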

Tested on a machine with a single RTX 3090.


Results

  • Four ControlNets and Latent Overlapping (configs/prompts/SampleConfig.yaml)
  • LCM (No ControlNet) (configs/prompts/SampleConfigLCM.yaml)
  • LCM-LoRA + Multi-ControlNet (configs/prompts/SampleConfigLCMLoRA.yaml)
  • IP-Adapter + LCM-LoRA + Multi-ControlNet (configs/prompts/SampleConfigIPAdapter.yaml)

Todo

  • GitHub Release
  • Bug Fixes and Improvements
  • Fixing xformers Issues and GPU Memory Optimization
  • Windows Support
  • Interface

Contact Us

Hamed Omidvar, Ph.D.: hamed.omidvar@intellerce.com
Vahideh Akhlaghi, Ph.D.: vahideh.akhlaghi@intellerce.com

License

This codebase is released under the Apache 2.0 license. For the licenses of the codebases this repository builds on, please refer to their corresponding GitHub/website pages.

Acknowledgements

This codebase was built upon and/or inspired by the following repositories: AnimateDiff, Diffusers, IP-Adapter, Video2Video, and Color Matcher.

The authors would like to thank Kalin Ovtcharov (Extropolis Corp.) for invaluable feedback and suggestions.