Trans-INR

This repository contains the official implementation for the following paper:

Transformers as Meta-Learners for Implicit Neural Representations
Yinbo Chen, Xiaolong Wang
ECCV 2022

Project page: https://yinboc.github.io/trans-inr/.

@inproceedings{chen2022transinr,
  title={Transformers as Meta-Learners for Implicit Neural Representations},
  author={Chen, Yinbo and Wang, Xiaolong},
  booktitle={European Conference on Computer Vision},
  year={2022},
}

Reproducing Experiments

Environment

  • Python 3
  • PyTorch 1.12.0
  • pyyaml numpy tqdm imageio TensorboardX wandb einops
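
The Python packages can be installed with pip. A minimal sketch, assuming PyTorch 1.12.0 is already installed for your CUDA version (package versions are not pinned in this README):

# Install the remaining dependencies listed above.
pip install pyyaml numpy tqdm imageio tensorboardX wandb einops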

Data

Run mkdir data and put the dataset folders listed below in it (the expected layout is sketched after the list).

  • CelebA: download (from kaggle), extract, and rename the folder as celeba (so that images are in data/celeba/img_align_celeba/img_align_celeba).

  • Imagenette: download, extract, and rename the folder as imagenette.

  • View synthesis: download from Google Drive (provided by learnit), put the files in a folder named learnit_shapenet, unzip the category folders, and rename them as chairs, cars, and lamps correspondingly.
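
After the steps above, the data/ directory should look roughly like this (only the paths stated above are shown; the internal structure of imagenette and the ShapeNet categories follows the downloaded archives):

data/
  celeba/img_align_celeba/img_align_celeba/    (CelebA images)
  imagenette/
  learnit_shapenet/
    chairs/
    cars/
    lamps/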

Training

Run CUDA_VISIBLE_DEVICES=[GPU] python run_trainer.py --cfg [CONFIG]; the configs are in cfgs/.

To enable wandb, complete wandb.yaml (in root) and add -w to the training command.

When running multiple multi-GPU training processes, specify -p with different values (0, 1, 2, ...) so they use different ports.
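
For example, the commands might look like the following sketch; [CONFIG] is a placeholder, so substitute an actual file from cfgs/:

# Single training run on GPU 0, with wandb logging enabled via -w.
CUDA_VISIBLE_DEVICES=0 python run_trainer.py --cfg cfgs/[CONFIG] -w

# Two concurrent multi-GPU runs, given different ports via -p.
CUDA_VISIBLE_DEVICES=0,1 python run_trainer.py --cfg cfgs/[CONFIG] -p 0
CUDA_VISIBLE_DEVICES=2,3 python run_trainer.py --cfg cfgs/[CONFIG] -p 1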

Evaluation

For image reconstruction, test PSNR is evaluated automatically by the training script.

For view synthesis, run on a single GPU with the configs in cfgs/nvs_eval. To enable test-time optimization, uncomment (remove the #) tto_steps in the configs.
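
For example, assuming the same run_trainer.py entry point is used for evaluation ([CONFIG] is a placeholder, pick one from cfgs/nvs_eval/):

# Evaluate view synthesis on a single GPU.
CUDA_VISIBLE_DEVICES=0 python run_trainer.py --cfg cfgs/nvs_eval/[CONFIG]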
