Presents implementations of the EvoNormB0 and EvoNormS0 layers proposed in *Evolving Normalization-Activation Layers* by Liu et al. The authors reported results with these layers on MobileNetV2, ResNets, MnasNet, and EfficientNets. Here, instead, they are tested on a Mini Inception architecture (as shown in this blog post) with the CIFAR-10 dataset.
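As a rough reference for what these layers compute, below is a minimal pure-Python sketch of the EvoNorm-S0 formula from the paper, y = x · sigmoid(v1 · x) / group_std(x) · γ + β. The function name and the nested-list tensor layout are illustrative assumptions, not the repository's actual tf.keras implementation:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def evonorm_s0(x, groups, v1=1.0, gamma=1.0, beta=0.0, eps=1e-5):
    """EvoNorm-S0 sketch for a single sample:
    y = x * sigmoid(v1 * x) / group_std(x) * gamma + beta.

    `x` is a nested list indexed as x[channel][spatial_position]; the
    standard deviation is taken over each group of channels together
    with all spatial positions, as in Group Normalization.
    """
    num_channels = len(x)
    assert num_channels % groups == 0, "channels must be divisible by groups"
    group_size = num_channels // groups
    out = [[0.0] * len(row) for row in x]
    for g in range(groups):
        chans = range(g * group_size, (g + 1) * group_size)
        # Pool all values in this channel group to compute the group variance.
        vals = [v for ch in chans for v in x[ch]]
        mean = sum(vals) / len(vals)
        var = sum((v - mean) ** 2 for v in vals) / len(vals)
        denom = math.sqrt(var + eps)
        for ch in chans:
            for i, v in enumerate(x[ch]):
                out[ch][i] = v * sigmoid(v1 * v) / denom * gamma + beta
    return out
```

Note that, unlike BN-ReLU, the nonlinearity (the swish-like x · sigmoid(v1 · x) term) and the normalization are fused into a single expression, which is the central idea of the evolved layers.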

Acknowledgements

TensorFlow version

2.2.0-rc3 (the version available on Colab at the time of testing)

About the files

  • `Mini_Inception_BN_ReLU.ipynb`: Shows a set of experiments with the Mini Inception architecture and the BN-ReLU combination.
  • `Mini_Inception_EvoNorm.ipynb`: Shows implementations of the EvoNormB0 and EvoNormS0 layers and experiments with the Mini Inception architecture.
  • `Mini_Inception_EvoNorm_Sweep.ipynb`: Runs a hyperparameter search over the `groups` parameter of the EvoNormS0 layer, along with a few other hyperparameters.
  • `layer_utils`: Ships the EvoNormB0 and EvoNormS0 layers as stand-alone `tf.keras` classes.
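For completeness, the batch-statistics variant can be sketched the same way. Below is a hedged pure-Python illustration of the EvoNorm-B0 formula, y = x / max(sqrt(var_batch + ε), v1 · x + sqrt(var_instance + ε)) · γ + β; the function name and nested-list layout are assumptions for illustration, while `layer_utils` ships the actual `tf.keras` versions:

```python
import math

def evonorm_b0(x, v1=1.0, gamma=1.0, beta=0.0, eps=1e-5):
    """EvoNorm-B0 sketch:
    y = x / max(sqrt(var_batch + eps), v1*x + sqrt(var_inst + eps)) * gamma + beta.

    `x` is indexed as x[sample][channel][spatial_position]. The batch
    variance is per channel over all samples and spatial positions; the
    instance variance is per (sample, channel) over spatial positions only.
    """
    n, c = len(x), len(x[0])
    out = [[[0.0] * len(x[s][ch]) for ch in range(c)] for s in range(n)]
    for ch in range(c):
        # Per-channel batch statistics across samples and spatial positions.
        batch_vals = [v for s in range(n) for v in x[s][ch]]
        b_mean = sum(batch_vals) / len(batch_vals)
        b_var = sum((v - b_mean) ** 2 for v in batch_vals) / len(batch_vals)
        b_std = math.sqrt(b_var + eps)
        for s in range(n):
            # Per-instance statistics over spatial positions only.
            vals = x[s][ch]
            i_mean = sum(vals) / len(vals)
            i_var = sum((v - i_mean) ** 2 for v in vals) / len(vals)
            i_std = math.sqrt(i_var + eps)
            for i, v in enumerate(vals):
                denom = max(b_std, v1 * v + i_std)
                out[s][ch][i] = v / denom * gamma + beta
    return out
```

Because it depends on batch statistics, a real EvoNorm-B0 layer would also maintain moving averages of the batch variance for inference, just as BatchNorm does; this sketch shows only the training-mode computation.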

Experimental Summary

An experimental summary is available here.

References