> **Note:** This repository was archived by the owner on Apr 23, 2024. It is now read-only.

BBuf/oneflow-cifar


# Train CIFAR10 with OneFlow

I'm playing with OneFlow on the CIFAR10 dataset.

## Prerequisites

  • Python 3.6+
  • OneFlow 0.5.0rc+

## Training

```bash
# Start training with:
python main.py

# You can manually resume the training with:
python main.py --resume --lr=0.01
```
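Although `main.py` is not reproduced here, the flags above suggest a conventional `argparse` setup. A minimal sketch of how `--resume` and `--lr` might be parsed (the flag names come from the commands above; the defaults and help text are illustrative assumptions, not the repository's actual code):

```python
import argparse

def build_parser():
    # Flag names mirror the training commands above; the default
    # learning rate and help strings are assumptions.
    parser = argparse.ArgumentParser(description="Train CIFAR10 with OneFlow")
    parser.add_argument("--lr", type=float, default=0.1, help="learning rate")
    parser.add_argument("--resume", action="store_true",
                        help="resume training from the latest checkpoint")
    return parser

# Parsing the resume command shown above:
args = build_parser().parse_args(["--resume", "--lr=0.01"])
```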

## Accuracy

| Model | Acc. |
| ---------------- | ------ |
| VGG16 | 93.92% |
| ResNet18 | 95.62% |
| ResNet50 | 95.40% |
| ResNet101 | |
| RegNetX_200MF | 95.10% |
| RegNetY_400MF | |
| MobileNetV2 | 92.56% |
| ResNeXt29(32x4d) | |
| ResNeXt29(2x64d) | |
| SimpleDLA | |
| DenseNet121 | |
| PreActResNet18 | |
| DPN92 | |
| DLA | |

## Quantization Aware Training

If you are interested in the OneFlow FX feature, compile the experimental FX branch of OneFlow as follows:

```bash
git clone https://github.com/Oneflow-Inc/oneflow
cd oneflow
git checkout add_fx_intermediate_representation
mkdir build
cd build
cmake -DCUDNN_ROOT_DIR=/usr/local/cudnn -DCMAKE_BUILD_TYPE=Release -DTHIRD_PARTY_MIRROR=aliyun -DUSE_CLANG_FORMAT=ON -DTREAT_WARNINGS_AS_ERRORS=OFF ..
make -j32
```
```bash
# Start training with:
python main_qat.py

# You can manually resume the training with:
python main_qat.py --resume --lr=0.01
```

**Note:** The `momentum` parameter of the `MovingAverageMinMaxObserver` class defaults to 0.95 and is left unchanged in all of the experiments below.
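The observer's job is to track the running minimum and maximum of activations with an exponential moving average controlled by that momentum. A plain-Python sketch of the update rule (the class name echoes OneFlow's, but this re-implementation is purely illustrative, not OneFlow's actual code):

```python
class MovingAverageMinMaxObserver:
    """Illustrative re-implementation: tracks the running min/max of
    activations with an exponential moving average, as is typical in QAT."""

    def __init__(self, momentum=0.95):  # 0.95 matches the note above
        self.momentum = momentum
        self.min_val = None
        self.max_val = None

    def update(self, batch):
        b_min, b_max = min(batch), max(batch)
        if self.min_val is None:
            # First batch initializes the running statistics directly.
            self.min_val, self.max_val = b_min, b_max
        else:
            # EMA update: old statistics dominate, new extremes leak in slowly.
            m = self.momentum
            self.min_val = m * self.min_val + (1 - m) * b_min
            self.max_val = m * self.max_val + (1 - m) * b_max

obs = MovingAverageMinMaxObserver()
obs.update([-1.0, 0.5, 2.0])
obs.update([-3.0, 4.0])  # running min/max drift only slightly toward the new extremes
```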

## Accuracy

| Model | quantization_bit | quantization_scheme | quantization_formula | per_layer_quantization | Acc |
| -------- | - | --------- | --------- | ----- | ------ |
| ResNet18 | 8 | symmetric | google | True | 95.19% |
| ResNet18 | 8 | symmetric | google | False | 95.24% |
| ResNet18 | 8 | affine | google | True | 95.32% |
| ResNet18 | 8 | affine | google | False | 95.30% |
| ResNet18 | 8 | symmetric | cambricon | True | 95.19% |
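For intuition on the symmetric vs. affine rows above, here is a sketch of the textbook formulas for deriving quantization parameters from an observed value range (these are the standard Google-style definitions; OneFlow's exact implementation may differ):

```python
def quant_params(min_val, max_val, bit=8, scheme="symmetric"):
    """Compute (scale, zero_point) from an observed range.
    Textbook formulas for illustration, not OneFlow's actual code."""
    if scheme == "symmetric":
        # Range is centered at zero, so the zero point is always 0.
        scale = max(abs(min_val), abs(max_val)) / (2 ** (bit - 1) - 1)
        zero_point = 0
    else:  # affine
        # The full [min, max] range is mapped onto [0, 2^bit - 1].
        scale = (max_val - min_val) / (2 ** bit - 1)
        zero_point = round(-min_val / scale)
    return scale, zero_point

s_sym, z_sym = quant_params(-1.0, 1.0, scheme="symmetric")
s_aff, z_aff = quant_params(0.0, 2.55, scheme="affine")
```

Symmetric quantization wastes part of the range when activations are skewed (e.g. after ReLU), which is one reason the affine rows score slightly higher here.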


## TODO

- Add DDP (distributed data parallel) training.
