
sayakpaul/EvoNorms-in-TensorFlow-2


Presents implementations of the EvoNormB0 and EvoNormS0 layers proposed in Evolving Normalization-Activation Layers by Liu et al. The authors reported results with these layers on MobileNetV2, ResNets, MnasNet, and EfficientNets; here, I instead try them on a Mini Inception architecture (as shown in this blog post) with the CIFAR10 dataset.
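
For reference, EvoNorm-S0 folds normalization and activation into a single expression: x · sigmoid(v·x) divided by a grouped standard deviation, followed by the usual learnable scale and shift. The snippet below is a minimal functional sketch of that expression only; the NHWC layout, helper names, and `groups=8` default are my assumptions, and the notebooks and layer_utils implement the layers as full tf.keras classes.

```python
import tensorflow as tf

def group_std(x, groups=8, eps=1e-5):
    # Per-sample standard deviation over channel groups and spatial dims
    # (NHWC assumed), analogous to the statistic used by GroupNorm.
    n, h, w = tf.shape(x)[0], tf.shape(x)[1], tf.shape(x)[2]
    c = x.shape[-1]  # channel count must be statically known
    grouped = tf.reshape(x, [n, h, w, groups, c // groups])
    _, var = tf.nn.moments(grouped, axes=[1, 2, 4], keepdims=True)
    std = tf.broadcast_to(tf.sqrt(var + eps), tf.shape(grouped))
    return tf.reshape(std, tf.shape(x))

def evonorm_s0(x, gamma, beta, v, groups=8):
    # EvoNorm-S0: (x * sigmoid(v * x)) / group_std(x), then affine scale/shift.
    # gamma, beta, v are per-channel parameters of shape (1, 1, 1, C).
    return (x * tf.sigmoid(v * x)) / group_std(x, groups) * gamma + beta
```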

Acknowledgements

TensorFlow version

2.2.0-rc3 (the version available on Colab when I tested the code)

About the files

  • Mini_Inception_BN_ReLU.ipynb: Shows a bunch of experiments with the Mini Inception architecture and BN-ReLU combination.
  • Mini_Inception_EvoNorm.ipynb: Shows implementations of EvoNormB0 and EvoNormS0 layers and experiments with the Mini Inception architecture.
  • Mini_Inception_EvoNorm_Sweep.ipynb: Does a hyperparameter search on the groups hyperparameter of EvoNormS0 layers along with a few other hyperparameters.
  • layer_utils: Ships EvoNormB0 and EvoNormS0 layers as stand-alone tf.keras classes (see the usage sketch below).
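
The snippet below shows how one of these layers might be dropped into a model in place of a BN-ReLU pair. The import path, the class name `EvoNormS0`, and its `groups` argument are assumptions for illustration; check layer_utils for the exact constructor signature.

```python
import tensorflow as tf
from layer_utils import EvoNormS0  # assumed module/class name; verify against layer_utils

# A tiny functional model where a single EvoNormS0 layer replaces BN + ReLU.
inputs = tf.keras.Input(shape=(32, 32, 3))
x = tf.keras.layers.Conv2D(32, 3, padding="same")(inputs)
x = EvoNormS0(groups=8)(x)  # normalization and activation in one layer (assumed signature)
x = tf.keras.layers.GlobalAveragePooling2D()(x)
outputs = tf.keras.layers.Dense(10, activation="softmax")(x)
model = tf.keras.Model(inputs, outputs)
model.summary()
```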

Experimental Summary

The experimental summary is available here.

References

  • Hanxiao Liu, Andrew Brock, Karen Simonyan, Quoc V. Le. "Evolving Normalization-Activation Layers." arXiv:2004.02967, 2020.
