Combining Algorithm Unrolling with Self-Supervised Learning

This project explores the early stages of incorporating self-supervised learning with algorithm unrolling. The code was written as part of a mathematical-engineering master's thesis (60 ECTS) at Aalborg University, Denmark.

The implementations of ISTA-Net and MAE (with ViT) are based on the papers "ISTA-Net: Interpretable Optimization-Inspired Deep Network for Image Compressive Sensing" and "Masked Autoencoders Are Scalable Vision Learners", respectively.

Requirements

Created using Python 3.10.6. See requirements.txt for further details.

Usage

ISTA-Net and ista2vec

The script is designed such that variables are changed directly in the code. The training.py script builds and executes the training loop; just provide a list of image paths.
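As a minimal sketch of assembling such a list (the helper name and extension filter below are assumptions for illustration; training.py itself only needs a list of file paths):

```python
from pathlib import Path

def collect_image_paths(data_dir):
    """Return a sorted list of image file paths found under data_dir.

    Hypothetical helper: training.py just expects a list of paths, so
    any method of building one works.
    """
    exts = {".png", ".jpg", ".jpeg", ".bmp", ".tif", ".tiff"}
    return sorted(
        str(p) for p in Path(data_dir).rglob("*") if p.suffix.lower() in exts
    )
```

The resulting list can then be assigned to the relevant variable directly in the script.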

ISTA-MAE and MAE

The scripts are designed such that parameters are set in a parameter file: parameters.py for pre-training or parameters_finetuning.py for fine-tuning. The path to the data must be specified in the parameter file, as must a model path for fine-tuning. After setting the parameters, run one of the training files:

  • pre-train_istamae.py
  • pre-train_mae.py
  • train_sr_istamae.py
  • train_sr_mae.py
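As a purely illustrative sketch of the kind of settings described above (the actual variable names in this repository's parameter files may differ; check parameters.py and parameters_finetuning.py for the real ones):

```python
# Hypothetical parameter-file fragment. The names below are assumptions
# for illustration only; only the *kinds* of settings are taken from the
# description above.
data_path = "data/train"                         # directory containing the training images
model_path = "checkpoints/istamae_pretrain.pth"  # pre-trained weights (needed for fine-tuning)
```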

Credit

Inspired by the original PyTorch implementations: "ISTA-Net-PyTorch" by Jian Zhang, and "Masked Autoencoders: A PyTorch Implementation" by Xinlei Chen and Kaiming He.

@mastersthesis{jonhardsson2023,
    author       = {Jónhardsson, Magnus and Jørgensen, Mads and Larsen, Andreas},
    school       = {Aalborg University},
    title        = {Combining Algorithm Unrolling with Self-Supervised Learning for Compressed Sensing Image Super-Resolution},
    year         = {2023}
}
