LGD

Repository implementing the Learnable Gradient Descent (LGD) method for image restoration via recurrent neural networks (RNNs). The project is mainly based on this paper.
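To give a rough idea of the method, below is a minimal sketch of one unrolled learned gradient descent iteration in PyTorch. The architecture, channel counts, and the `restore` helper are illustrative assumptions, not the code used in this repository:

```python
# Sketch of a learned gradient descent step (illustrative only;
# module and parameter names are NOT taken from this repository).
import torch
import torch.nn as nn

class LGDCell(nn.Module):
    """One unrolled iteration: a small CNN maps (current estimate,
    data-term gradient, hidden state) to an update and a new state."""
    def __init__(self, channels=1, hidden=32):
        super().__init__()
        self.hidden = hidden
        self.net = nn.Sequential(
            nn.Conv2d(2 * channels + hidden, hidden, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(hidden, channels + hidden, kernel_size=3, padding=1),
        )

    def forward(self, x, grad, h):
        out = self.net(torch.cat([x, grad, h], dim=1))
        dx, h_new = out[:, :x.shape[1]], out[:, x.shape[1]:]
        return x + dx, h_new

def restore(y, forward_op, adjoint_op, cell, n_iter=10):
    """Unrolled reconstruction: x_{k+1} = x_k + f_theta(x_k, grad_k, h_k),
    where grad_k is the gradient of the data term 0.5 * ||A x - y||^2."""
    x = adjoint_op(y)  # simple initialization from the observation
    h = torch.zeros(x.shape[0], cell.hidden, x.shape[2], x.shape[3], device=x.device)
    for _ in range(n_iter):
        grad = adjoint_op(forward_op(x) - y)
        x, h = cell(x, grad, h)
    return x
```

Here forward_op and adjoint_op stand for the (assumed linear) degradation operator and its adjoint, e.g. blurring/downsampling and their transposes; the actual notebooks define their own operators.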

Requirements

All requirements are listed in requirements.txt. To install them, run pip install -r requirements.txt

Pretrained models

All pretrained models can be downloaded from here. You can use them to run inference in the corresponding Jupyter notebooks.

Experiments description

For convenience, all inference experiments are provided as Jupyter notebooks:

| Path to notebook | Description |
| --- | --- |
| RNN_denoising.ipynb | Denoising using the learned gradient descent network |
| RNN_deblurring.ipynb | Deblurring using the learned gradient descent network |
| RNN_super-resolution.ipynb | Super-resolution using the learned gradient descent network |
| TV_denoising_LBFGS.ipynb | Denoising using total-variation restoration with an L-BFGS minimizer |
| TV_deblurring_LBFGS.ipynb | Deblurring using total-variation restoration with an L-BFGS minimizer |
| TV_super-resolution_LBFGS.ipynb | Super-resolution using total-variation restoration with an L-BFGS minimizer |
| TV_segsynthesis_LBFGS.ipynb | Semantic synthesis using total-variation restoration with an L-BFGS minimizer (unsuccessful) |
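
The TV_*_LBFGS.ipynb baselines restore images by directly minimizing a total-variation regularized objective with L-BFGS. A minimal sketch of that idea for denoising is shown below; the regularization weight, the anisotropic TV definition, and the exact objective are assumptions, not necessarily what the notebooks use:

```python
# Sketch of total-variation denoising with torch.optim.LBFGS
# (illustrative; not the exact formulation from the notebooks).
import torch

def tv_denoise_lbfgs(y, lam=0.1, n_steps=50):
    """Minimize 0.5 * ||x - y||^2 + lam * TV(x) with L-BFGS."""
    x = y.clone().requires_grad_(True)
    opt = torch.optim.LBFGS([x], max_iter=n_steps)

    def tv(z):
        # Anisotropic total variation: sum of absolute finite differences.
        dh = (z[..., 1:, :] - z[..., :-1, :]).abs().sum()
        dw = (z[..., :, 1:] - z[..., :, :-1]).abs().sum()
        return dh + dw

    def closure():
        opt.zero_grad()
        loss = 0.5 * (x - y).pow(2).sum() + lam * tv(x)
        loss.backward()
        return loss

    opt.step(closure)
    return x.detach()
```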

Datasets

BSD500 for training on all linear restoration problems

BSD68 for testing on all linear restoration problems

ADE20K for experimenting with semantic synthesis

Additional information

This work is related to the final project for the 2020 Bayesian Methods of Machine Learning course at Skoltech. Initially, I am trying to reproduce a restoration method which is now commonly known as learned gradient descent. A more detailed description is given in this document.
