IKC: Blind Super-Resolution With Iterative Kernel Correction

This is the implementation of 'Blind Super-Resolution With Iterative Kernel Correction' (CVPR 2019).
It is based on [BasicSR] and [MMSR]; for more details, see BasicSR.
Thanks to Jinjin Gu and Xintao Wang.

Updates

[2019-09-22] IKC v0.1 released.
[2019-09-25] IKC v0.2 released. Users can now edit the .yaml files to change settings (scale, sigma, etc.).

Architecture

Kernel mismatch

Dependencies

  • Python 3 (Anaconda is recommended)
  • PyTorch >= 1.0
  • NVIDIA GPU + CUDA
  • Python packages: pip install numpy opencv-python lmdb pyyaml
  • TensorBoard:
    • PyTorch >= 1.1: pip install tb-nightly future
    • PyTorch == 1.0: pip install tensorboardX

Installation

  • Clone this repo:
git clone https://github.com/yuanjunchai/IKC.git
cd IKC

Dataset Preparation

We use the DIV2K, Flickr2K, Set5, Set14, Urban100, and BSD100 datasets. To train a model on the full dataset (DIV2K + Flickr2K, 3450 images in total), download the datasets from their official websites. After downloading, run codes/scripts/generate_mod_LR_bic.py to generate the LRblur/LR/HR/Bicubic dataset paths and the corresponding kernel maps.

python codes/scripts/generate_mod_LR_bic.py
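
For reference, the degradation in the paper is an isotropic Gaussian blur followed by bicubic downsampling. The sketch below illustrates that idea for a single image; it is not the repository's script, and the helper names, the example path, and the default sigma/scale values are illustrative only.

# Illustrative sketch (not the repository code): blur an HR image with an
# isotropic Gaussian kernel, then downsample bicubically to obtain LRblur.
import cv2
import numpy as np

def isotropic_gaussian_kernel(ksize=21, sigma=2.6):
    # 2D Gaussian kernel, normalized to sum to 1
    ax = np.arange(ksize) - (ksize - 1) / 2.0
    xx, yy = np.meshgrid(ax, ax)
    kernel = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return kernel / kernel.sum()

def degrade(hr, scale=4, sigma=2.6):
    kernel = isotropic_gaussian_kernel(sigma=sigma)
    blurred = cv2.filter2D(hr, -1, kernel, borderType=cv2.BORDER_REFLECT)
    lr_blur = cv2.resize(blurred, None, fx=1.0/scale, fy=1.0/scale,
                         interpolation=cv2.INTER_CUBIC)
    return lr_blur, kernel  # the kernel is what the Predictor/Corrector estimate

hr = cv2.imread('path/to/HR/image.png')  # hypothetical path
lr_blur, kernel = degrade(hr, scale=4, sigma=2.6)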

About data

During training, dataset_GT is used to synthesize the actual LR images and their corresponding kernels in train_IKC.py and train_SFTMD.py, so dataset_LQ is not used.
During testing, test_SFTMD.py works the same way to obtain kernel maps.
However, you do need to set dataset_LQ for test_IKC.py.
Alternatively, you can generate the LR data in advance with generate_mod_LR_bic.py.

Getting Started

Pretrained model

You can download the pre-trained models from the ./checkpoints directory.
Remember to change opt['path']['pretrain_model_G'] in the .yaml file to the path where you saved the models.

Train

First train the SFTMD network, then use the pretrained SFTMD to train the Predictor and Corrector networks iteratively (a sketch of this alternation follows the steps below).

  1. To train the SFTMD model, change the image paths in codes/options/train/train_SFTMD.yml, especially dataroot_GT and dataroot_LQ. You can change opt['name'] to save checkpoints under a different name, and opt['gpu_ids'] to assign specific GPUs.
python codes/train_SFTMD.py -opt_F codes/options/train/train_SFTMD.yml
  2. To train the Predictor and Corrector models, first change opt_F['sftmd']['path']['pretrain_model_G'] to the path of the pretrained SFTMD checkpoint. Also fill dataroot_GT and dataroot_LQ of opt_P and opt_C with the corresponding training and validation data paths.
python codes/train_IKC.py -opt_F codes/options/train/train_SFTMD.yml -opt_P codes/options/train/train_Predictor.yml -opt_C codes/options/train/train_Corrector.yml
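
The sketch below outlines the Predictor/Corrector alternation described above: the pretrained SFTMD is kept fixed, the Predictor estimates a kernel from the LR input, and the Corrector repeatedly refines that estimate using the SR result. It is a minimal illustration, not the code in train_IKC.py; the module names, optimizer variables, and the number of correction steps are assumptions.

# Illustrative sketch of the Predictor/Corrector training alternation
# (hypothetical module and variable names; not the code in train_IKC.py).
import torch
import torch.nn.functional as F

def train_step(sftmd, predictor, corrector, opt_P, opt_C, lr_img, kernel_gt, steps=7):
    sftmd.eval()                               # SFTMD is pretrained and kept fixed
    # Predictor: estimate the kernel code directly from the LR image
    k_est = predictor(lr_img)
    loss_P = F.mse_loss(k_est, kernel_gt)
    opt_P.zero_grad(); loss_P.backward(); opt_P.step()

    # Corrector: iteratively refine the kernel estimate using the SR result
    k_est = k_est.detach()
    for _ in range(steps):                     # steps is an illustrative choice
        with torch.no_grad():
            sr = sftmd(lr_img, k_est)          # SR conditioned on current kernel
        k_est = corrector(sr, k_est)           # correct the kernel estimate
        loss_C = F.mse_loss(k_est, kernel_gt)
        opt_C.zero_grad(); loss_C.backward(); opt_C.step()
        k_est = k_est.detach()
    return loss_P.item(), loss_C.item()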

Test

  1. First, run codes/scripts/generate_mod_LR_bic.py to generate the LRblur/LR/HR/Bicubic dataset paths and the corresponding kernel maps.
python codes/scripts/generate_mod_LR_bic.py
  2. To test the SFTMD model, change the test dataset paths in codes/options/test/test_SFTMD.yml.
python codes/test_SFTMD.py -opt_F codes/options/test/test_SFTMD.yml
  3. To test the Predictor and Corrector models, change the dataset paths in codes/options/test/test_Predictor.yml and codes/options/test/test_Corrector.yml.
python codes/test_IKC.py -opt_F codes/options/test/test_SFTMD.yml -opt_P codes/options/test/test_Predictor.yml -opt_C codes/options/test/test_Corrector.yml

The 'dataroot_GT' entry is only used for PSNR calculation. To run blind SR on your own LR data, set 'dataroot_GT: ~' and provide only your LR images.
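
At test time the same iterative correction is applied without ground-truth kernels. The sketch below shows the inference loop in minimal form; the function and module names and the number of iterations are assumptions, and test_IKC.py remains the reference procedure.

# Illustrative sketch of inference-time iterative kernel correction
# (hypothetical names; see test_IKC.py for the actual procedure).
import torch

@torch.no_grad()
def ikc_inference(sftmd, predictor, corrector, lr_img, steps=7):
    k_est = predictor(lr_img)            # initial kernel estimate from the LR input
    sr = sftmd(lr_img, k_est)            # initial SR conditioned on that estimate
    for _ in range(steps):               # steps is an illustrative choice
        k_est = corrector(sr, k_est)     # refine the kernel estimate from the SR
        sr = sftmd(lr_img, k_est)        # re-run SFTMD with the corrected kernel
    return sr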

Citation

@InProceedings{gu2019blind,
    author = {Gu, Jinjin and Lu, Hannan and Zuo, Wangmeng and Dong, Chao},
    title = {Blind super-resolution with iterative kernel correction},
    booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
    month = {June},
    year = {2019}
}
