
Welcome to the RedBit documentation. Here you can find additional information about the RedBit framework implementation and the results obtained with it so far.

CNN Architectures

We have implemented multiple CNNs; here we present only their baseline architectures. Each quantization method may introduce modifications to a given CNN architecture, so for details see the code implementation of each model under the specific quantization method.

We have implemented LeNet-5 to be trained with MNIST.

We also implemented ResNet-20, ResNet-50, and VGG-16 to be trained with CIFAR-10.

Finally, to be trained with ImageNet, we implemented AlexNet, ResNet-18, and VGG-16.
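
For reference, the following is a minimal sketch of what a baseline LeNet-5 for MNIST can look like in PyTorch. The class and layer names are illustrative and do not necessarily match the RedBit implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LeNet5(nn.Module):
    """Baseline (non-quantized) LeNet-5 for 28x28 MNIST inputs."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        # Two convolutional feature-extraction stages.
        self.conv1 = nn.Conv2d(1, 6, kernel_size=5, padding=2)  # 28x28 -> 28x28
        self.conv2 = nn.Conv2d(6, 16, kernel_size=5)            # 14x14 -> 10x10
        # Three fully connected layers form the classifier.
        self.fc1 = nn.Linear(16 * 5 * 5, 120)
        self.fc2 = nn.Linear(120, 84)
        self.fc3 = nn.Linear(84, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = F.max_pool2d(F.relu(self.conv1(x)), 2)  # 28x28 -> 14x14
        x = F.max_pool2d(F.relu(self.conv2(x)), 2)  # 10x10 -> 5x5
        x = torch.flatten(x, 1)
        x = F.relu(self.fc1(x))
        x = F.relu(self.fc2(x))
        return self.fc3(x)


if __name__ == "__main__":
    model = LeNet5()
    dummy = torch.randn(1, 1, 28, 28)  # one MNIST-shaped input
    print(model(dummy).shape)          # torch.Size([1, 10])
```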

Quantization Results

We gathered a significant amount of test results across three datasets: MNIST, CIFAR-10, and ImageNet. Each test was run after first searching for the combination of optimizer and initial learning rate that produced the best result. This search was conducted through shorter runs of 30 epochs for MNIST, 20 epochs for CIFAR-10, and 10 epochs for the ImageNet-related CNNs.
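
This search can be pictured as a simple grid over optimizers and initial learning rates, where each candidate is trained for the reduced epoch budget and the best validation accuracy wins. The sketch below only illustrates the selection logic; the `train_and_evaluate` helper, the optimizer set, and the learning-rate grid are assumptions for illustration and are not part of RedBit's actual code.

```python
import torch


def train_and_evaluate(model, optimizer, epochs):
    # Hypothetical helper: a real implementation would run the training loop
    # and return validation accuracy. Returning 0.0 keeps this sketch runnable.
    return 0.0


def search_hyperparameters(make_model, epochs):
    """Pick the optimizer / initial learning rate pair with the best short-run accuracy."""
    optimizers = {
        "sgd": lambda params, lr: torch.optim.SGD(params, lr=lr, momentum=0.9),
        "adam": lambda params, lr: torch.optim.Adam(params, lr=lr),
    }
    learning_rates = [1e-1, 1e-2, 1e-3]  # illustrative grid

    best = (None, None, 0.0)  # (optimizer name, learning rate, accuracy)
    for opt_name, make_opt in optimizers.items():
        for lr in learning_rates:
            model = make_model()  # fresh weights for each trial
            optimizer = make_opt(model.parameters(), lr)
            accuracy = train_and_evaluate(model, optimizer, epochs)
            if accuracy > best[2]:
                best = (opt_name, lr, accuracy)
    return best


# Shorter search budgets per dataset, as described above, e.g.:
# best_mnist = search_hyperparameters(lambda: LeNet5(), epochs=30)
```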

All tables show the final results we gathered, together with the hyperparameters used to train each CNN.

For MNIST, we collected quantization results with LeNet-5.

For CIFAR-10, we collected quantization results for ResNet-20, ResNet-50, and VGG-16.

Finally, for ImageNet, we collected quantization results for AlexNet, ResNet-18, and VGG-16.