Quantization Schemes For Training Neural Networks

This work shows how to train neural networks with quantized weights directly via backpropagation, at arbitrarily specified bit-depths. The code quantizes both weights and activations, and trains LeNet-300 on MNIST and ResNet18 on CIFAR10. More details are in the accompanying PDF.
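As a rough illustration of the idea (not the repository's actual implementation), the sketch below quantizes a layer's weights to a given bit-depth in the forward pass while using a straight-through estimator, so backpropagation still updates the underlying full-precision weights. All class and function names here are illustrative assumptions.

```python
# Minimal sketch of weight quantization with a straight-through estimator.
# Illustrative only; the repository's scheme may differ.
import torch
import torch.nn as nn


class QuantizeSTE(torch.autograd.Function):
    """Uniform quantization of values in [-1, 1] to 2**bits levels."""

    @staticmethod
    def forward(ctx, x, bits):
        steps = 2 ** bits - 1
        x = torch.clamp(x, -1.0, 1.0)
        # Snap to the nearest point on the uniform grid.
        return torch.round((x + 1.0) / 2.0 * steps) / steps * 2.0 - 1.0

    @staticmethod
    def backward(ctx, grad_output):
        # Straight-through: pass the gradient unchanged to the latent weights.
        return grad_output, None


class QuantLinear(nn.Linear):
    """Linear layer whose weights are quantized to wbits in the forward pass."""

    def __init__(self, in_features, out_features, wbits=4):
        super().__init__(in_features, out_features)
        self.wbits = wbits

    def forward(self, x):
        # tanh bounds the latent weights to (-1, 1) before quantization.
        w_q = QuantizeSTE.apply(torch.tanh(self.weight), self.wbits)
        return nn.functional.linear(x, w_q, self.bias)
```

The same QuantizeSTE function can be applied to layer outputs to quantize activations, which is what a flag like --abits would control.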

Run Experiment

--wbits specifies the number of bits for weights
--abits specifies the number of bits for activations

python Trainer.py --wbits 4 --abits 4
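For reference, a minimal sketch of how such flags might be wired up inside a training script; the actual Trainer.py may parse them differently:

```python
# Hypothetical argument parsing for the bit-depth flags.
import argparse

parser = argparse.ArgumentParser(description="Train with quantized weights/activations")
parser.add_argument("--wbits", type=int, default=4, help="number of bits for weights")
parser.add_argument("--abits", type=int, default=4, help="number of bits for activations")
args = parser.parse_args()
```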

Results