
FDA: Fourier Domain Adaptation for Semantic Segmentation.

This is the PyTorch implementation of our FDA paper published at CVPR 2020.

Domain adaptation via style transfer made easy using the Fourier transform. FDA needs no deep networks for style transfer and involves no adversarial training. The proposed Fourier Domain Adaptation method, diagrammed below, consists of three steps (a minimal code sketch of these steps follows the diagram):

Step 1: Apply FFT to source and target images.

Step 2: Replace the low frequency part of the source amplitude with that from the target.

Step 3: Apply inverse FFT to the modified source spectrum.

Image of FDA
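
For reference, here is a minimal NumPy sketch of the three steps above. The function name, the square low-frequency window, and the use of fftshift are our own illustration; the utilities shipped with the repository may differ in such details.

    import numpy as np

    def fda_source_to_target(src, trg, beta=0.01):
        """Replace the low-frequency amplitude of src with that of trg.

        src, trg: float arrays of shape (H, W, C) in the original image range;
        trg is assumed to have been resized to the same shape as src.
        beta: relative size of the low-frequency window that is swapped.
        """
        # Step 1: per-channel FFT of both images, shifted so the low
        # frequencies sit at the centre of the spectrum.
        fft_src = np.fft.fftshift(np.fft.fft2(src, axes=(0, 1)), axes=(0, 1))
        fft_trg = np.fft.fftshift(np.fft.fft2(trg, axes=(0, 1)), axes=(0, 1))

        amp_src, pha_src = np.abs(fft_src), np.angle(fft_src)
        amp_trg = np.abs(fft_trg)

        # Step 2: swap the central (low-frequency) window of the amplitude.
        h, w = src.shape[:2]
        b = int(np.floor(min(h, w) * beta))
        ch, cw = h // 2, w // 2
        amp_src[ch - b:ch + b, cw - b:cw + b] = amp_trg[ch - b:ch + b, cw - b:cw + b]

        # Step 3: recombine the swapped amplitude with the source phase and
        # apply the inverse FFT; keep the real part as the stylised image.
        fft_mod = np.fft.ifftshift(amp_src * np.exp(1j * pha_src), axes=(0, 1))
        return np.real(np.fft.ifft2(fft_mod, axes=(0, 1)))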

We have prepared a well-documented version of the original repository, with the code flow explained, available here.

Usage

  1. FDA Demo

    python3 FDA_demo.py

    An example of FDA for domain adaptation (source: GTA5, target: CityScapes, beta = 0.01); a stripped-down sketch of what the demo does is given below the example image.

    Image of Source
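
    In essence, the demo boils down to the following (a hedged sketch that reuses the fda_source_to_target helper from the sketch above; the file paths are placeholders, and FDA_demo.py itself may differ):

    import numpy as np
    from PIL import Image

    # Placeholder paths for one GTA5 (source) and one CityScapes (target) image.
    src = np.asarray(Image.open('demo_images/source.png').convert('RGB'), np.float64)
    trg_img = Image.open('demo_images/target.png').convert('RGB')

    # Resize the target to the source resolution before swapping spectra.
    trg = np.asarray(trg_img.resize((src.shape[1], src.shape[0]), Image.BICUBIC), np.float64)

    # FDA is applied in the original image range, with beta = 0.01 as in the demo.
    src_in_trg = fda_source_to_target(src, trg, beta=0.01)

    # Clip back to a valid image range and save; any mean subtraction or
    # normalisation for the network would happen only after this point.
    Image.fromarray(np.clip(src_in_trg, 0, 255).astype(np.uint8)).save('demo_images/src_in_trg.png')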

  2. Sim2Real Adaptation Using FDA (single beta)

    python3 train.py --snapshot-dir='../checkpoints/FDA' --init-weights='../checkpoints/FDA/init_weight/DeepLab_init.pth' --LB=0.01 --entW=0.005 --ita=2.0 --switch2entropy=0

    Important: apply FDA to the original images first, and only then do mean subtraction, normalization, etc. Otherwise there will be numerical artifacts.

    DeepLab initialization can be downloaded through this link.

    LB: beta in the paper; controls the size of the low-frequency window to be replaced.

    entW: weight on the entropy term.

    ita: coefficient for the robust norm on entropy (a sketch of this term is given after the parameter descriptions).

    switch2entropy: entropy minimization kicks in after this many steps.
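
    For reference, a minimal PyTorch sketch of how entW, ita, and switch2entropy plausibly enter the objective, assuming a Charbonnier-style robust penalty on the normalised per-pixel entropy (the constants and normalisation used in train.py may differ):

    import math
    import torch
    import torch.nn.functional as F

    def robust_entropy_loss(logits, ita=2.0, num_classes=19):
        """Charbonnier-style penalty on the per-pixel prediction entropy."""
        p = F.softmax(logits, dim=1)                 # (N, C, H, W)
        log_p = F.log_softmax(logits, dim=1)
        ent = -(p * log_p).sum(dim=1)                # per-pixel entropy
        ent = ent / math.log(num_classes)            # normalise to [0, 1]
        # The robust norm down-weights pixels that are already confident.
        return ((ent ** 2 + 1e-8) ** ita).mean()

    # Hypothetical use inside the training loop, i being the current step:
    # loss = segmentation_loss_on_source
    # if i >= args.switch2entropy:
    #     loss = loss + args.entW * robust_entropy_loss(target_logits, ita=args.ita)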

  3. Evaluation of the Segmentation Networks Adapted with Multi-band Transfer (multiple betas)

    python3 evaluation_multi.py --model='DeepLab' --save='../results' --restore-opt1="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_01" --restore-opt2="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_05" --restore-opt3="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_09"

    Pretrained models on the GTA5 -> CityScapes task using DeepLab backbone can be downloaded here.

    The above command should output:
    ===> mIoU19: 50.45
    ===> mIoU16: 54.23
    ===> mIoU13: 59.78
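
    Multi-band transfer (MBT) simply averages the softmax outputs of the models adapted with different betas before taking the argmax. A hedged sketch of that fusion step (model loading, resizing, and evaluation bookkeeping are omitted):

    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def mbt_predict(models, image):
        """Average the softmax outputs of models trained with different betas."""
        probs = None
        for model in models:                       # e.g. the LB=0.01 / 0.05 / 0.09 checkpoints
            out = F.softmax(model(image), dim=1)   # (1, C, H, W)
            probs = out if probs is None else probs + out
        probs = probs / len(models)
        return probs.argmax(dim=1)                 # fused label map, shape (1, H, W)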

  4. Get Pseudo Labels for Self-supervised Training

    python3 getSudoLabel_multi.py --model='DeepLab' --data-list-target='./dataset/cityscapes_list/train.txt' --set='train' --restore-opt1="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_01" --restore-opt2="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_05" --restore-opt3="../checkpoints/FDA/gta2city_deeplab/gta2city_LB_0_09"
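
    Conceptually, the pseudo labels are the fused MBT predictions with low-confidence pixels marked as ignore. A hedged sketch (the threshold value and any per-class handling in getSudoLabel_multi.py may differ):

    import torch

    @torch.no_grad()
    def make_pseudo_label(mean_probs, threshold=0.9, ignore_index=255):
        """mean_probs: (1, C, H, W) averaged softmax output from the MBT ensemble."""
        conf, label = mean_probs.max(dim=1)        # per-pixel confidence and predicted class
        label[conf < threshold] = ignore_index     # exclude uncertain pixels from self-training
        return label                               # (1, H, W) pseudo-label map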

  5. Self-supervised Training with Pseudo Labels

    python3 SStrain.py --model='DeepLab' --snapshot-dir='../checkpoints/FDA' --init-weights='../checkpoints/FDA/init_weight/DeepLab_init.pth' --label-folder='cs_pseudo_label' --LB=0.01 --entW=0.005 --ita=2.0

  6. Other Models

    VGG initializations can be downloaded through this link.

    python3 train.py --model='VGG' --learning-rate=1e-5 --snapshot-dir='../checkpoints/FDA' --init-weights='../checkpoints/FDA/init_weight/vggfcn_gta5_init.pth' --LB=0.01 --entW=0.005 --ita=2.0 --switch2entropy=0

    Pretrained models on the Synthia -> CityScapes task using the DeepLab backbone: link.

    Pretrained models on the GTA5 -> CityScapes task using the VGG backbone: link.

    Pretrained models on the Synthia -> CityScapes task using the VGG backbone: link.

Additional Work Done by AGV.AI (IITKGP): victorvini08, karan-uppal3, saurabhmishra608, KaLiMaLi555

  1. Models trained by the team at AGV.AI (IITKGP)

    Beta Value    DeepLab    VGG16
    0.01 (T=0)    link       link
    0.05 (T=0)    link       link
    0.09 (T=0)    link       link
    0.01 (T=1)    link       link
    0.05 (T=1)    link       link
    0.09 (T=1)    link       link
    0.01 (T=2)    link       link
    0.05 (T=2)    link       link
    0.09 (T=2)    link       link
  2. Files train.py and SStrain_VGG.py are integrated with wandb and will log the source and target losses at the frequency specified by the corresponding argument. You only need to log in during the initial run of the code, which is done using

    wandb.login()

    The logging process is then started using

    wandb.init()

    and the values are logged during training using

    wandb.log(...)
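
    Putting the three calls together, the logging flow looks roughly like this (the project name, config keys, and logged values are placeholders rather than the exact ones used in train.py):

    import wandb

    wandb.login()                                   # one-time authentication on the first run
    run = wandb.init(project='FDA', config={'LB': 0.01, 'entW': 0.005, 'ita': 2.0})

    num_steps, log_every = 100, 10                  # placeholder values for illustration
    for step in range(num_steps):
        # ... a real training iteration would produce the source/target losses here ...
        loss_src, loss_trg = 1.0 / (step + 1), 2.0 / (step + 1)
        if step % log_every == 0:
            wandb.log({'source_loss': loss_src, 'target_loss': loss_trg}, step=step)

    run.finish()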

  3. Our team has optimised the pseudo-label generation code (getSudoLabel_multi.py); the difference between the pseudo labels produced by the original and the optimised code is shown below:

Image of Pseudo Labels

Acknowledgment

Code adapted from BDL.
