Official Code for NeurIPS 2022 Paper: How Mask Matters: Towards Theoretical Understandings of Masked Autoencoders

zhangq327/U-MAE

U-MAE (Uniformity-enhanced Masked Autoencoder)

This repository includes a PyTorch implementation of the NeurIPS 2022 paper How Mask Matters: Towards Theoretical Understandings of Masked Autoencoders authored by Qi Zhang*, Yifei Wang*, and Yisen Wang.

U-MAE extends MAE (He et al., 2022) by further encouraging feature uniformity, which successfully addresses the dimensional feature collapse issue of MAE.

Instructions

This repo is based on the official code of MAE with the minor modifications described below, and we follow all of MAE's default training and evaluation configurations. Please see their instructions in README_mae.md for details.

Main differences. U-MAE adds a uniformity_loss (implemented in loss_func.py) as a uniformity regularization term on top of the MAE loss. It also introduces an additional hyper-parameter lamb (default: 1e-2) in pretrain.sh, which is the coefficient of the uniformity regularization in the U-MAE loss.
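The structure of such a loss can be sketched as follows. This is an illustrative example only, not the repo's loss_func.py: the exact form of the uniformity term used by U-MAE should be read from the official code, and the mean squared off-diagonal cosine similarity used here is an assumption.

```python
# Illustrative sketch of "reconstruction loss + lamb * uniformity term".
# NOT the official loss_func.py; the specific uniformity penalty below
# (mean squared off-diagonal cosine similarity) is an assumed stand-in.
import torch
import torch.nn.functional as F

def uniformity_loss(z: torch.Tensor) -> torch.Tensor:
    """Penalize pairwise similarity between L2-normalized features z of shape (B, D)."""
    z = F.normalize(z, dim=-1)
    sim = z @ z.t()                                 # (B, B) cosine similarities
    mask = ~torch.eye(z.size(0), dtype=torch.bool)  # drop the self-similarity diagonal
    return (sim[mask] ** 2).mean()                  # small when features spread out

def umae_loss(recon_loss: torch.Tensor, z: torch.Tensor, lamb: float = 1e-2) -> torch.Tensor:
    """U-MAE-style objective: MAE reconstruction loss plus a weighted uniformity term."""
    return recon_loss + lamb * uniformity_loss(z)
```

For example, orthogonal features incur zero penalty, while identical features are maximally penalized, which is the collapse the regularizer discourages.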

Minor points:

  1. We add a linear classifier to monitor online linear probing accuracy; its gradients are not backpropagated to the backbone encoder.
  2. For efficiency, we train U-MAE for only 200 epochs and accordingly use 20 warmup epochs.
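The detached online probe in point 1 can be illustrated with a minimal sketch. The module names here are hypothetical stand-ins, not the repo's; the key point is the detach() between encoder features and the probe.

```python
# Minimal sketch of an online linear probe that does not update the backbone.
# "encoder" and "probe" are illustrative stand-ins, not names from this repo.
import torch
import torch.nn as nn

encoder = nn.Linear(32, 16)   # stand-in for the backbone encoder
probe = nn.Linear(16, 10)     # online linear classifier
criterion = nn.CrossEntropyLoss()

x = torch.randn(8, 32)
y = torch.randint(0, 10, (8,))

feats = encoder(x)
logits = probe(feats.detach())  # detach: probe gradients never reach the encoder
loss = criterion(logits, y)
loss.backward()

assert encoder.weight.grad is None      # backbone untouched by the probe loss
assert probe.weight.grad is not None    # probe itself still trains
```

This way the probe tracks linear accuracy during pretraining without influencing the learned representation.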

Citing this work

If you find the work useful, please cite the accompanying paper:

@inproceedings{zhang2022how,
  title={How Mask Matters: Towards Theoretical Understandings of Masked Autoencoders},
  author={Zhang, Qi and Wang, Yifei and Wang, Yisen},
  booktitle={NeurIPS},
  year={2022}
}

Acknowledgement

Our code follows the official implementations of MAE (https://github.com/facebookresearch/mae).
