# Bit-slice Sparsity

This repo contains the code for our preliminary study [paper][poster][presentation], which aims at improving bit-slice sparsity for efficient ReRAM deployment of DNNs. The code is tested with PyTorch 1.2.0 and Python 3.7.
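For a rough picture of what bit-slice sparsity means here: a fixed-point weight is stored across several low-precision ReRAM cells, so its binary code is split into slices, and a slice that is zero contributes no analog computation. A minimal sketch of such a decomposition (the 2-bit slice width and the `bit_slices` helper are illustrative assumptions, not this repo's API):

```python
import torch

def bit_slices(q: torch.Tensor, total_bits: int = 8, slice_bits: int = 2):
    """Split non-negative fixed-point codes into slices, LSB first.

    Each slice is what one column of low-precision ReRAM cells would store;
    bit-slice sparsity is the fraction of zeros within each slice.
    """
    q = q.to(torch.int64)
    mask = (1 << slice_bits) - 1
    return [(q >> (slice_bits * i)) & mask
            for i in range(total_bits // slice_bits)]

# Example: 8-bit codes -> four 2-bit slices
codes = torch.tensor([0, 3, 64, 255])
for i, s in enumerate(bit_slices(codes)):
    sparsity = (s == 0).float().mean().item()
    print(f"slice {i}: {s.tolist()}, sparsity = {sparsity:.2f}")
```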

The code for MNIST and CIFAR-10 is in `mnist/` and `cifar/`, respectively. The training routine consists of three parts: pre-training, pruning, and fine-tuning.

First, pre-train a fixed-point model:

```
python pretrain.py
```
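Fixed-point training itself is handled by nics_fix_pytorch (see the acknowledgement below). Purely as an illustration of what "fixed-point" means here, and not the library's actual API, a minimal symmetric fake-quantizer might look like:

```python
import torch

def fake_quantize(x: torch.Tensor, total_bits: int = 8) -> torch.Tensor:
    """Illustrative sketch only: round values to a signed `total_bits`-bit
    grid but keep them as floats, so ordinary training code can run on them.
    """
    n_levels = 2 ** (total_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = x.abs().max().clamp_min(1e-8) / n_levels
    return (x / scale).round().clamp(-n_levels, n_levels) * scale
```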

Then, load and prune the pre-trained model, and fine-tune it with either plain L1 regularization or bit-slice L1 regularization:

```
python finetune_l1.py
```

or

```
python finetune_bitslice.py
```
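As a sketch of what a bit-slice L1 penalty could look like (the function name, the 2-bit slice width, and the straight-through surrogate gradient are all assumptions for illustration; see the paper for the actual formulation):

```python
import torch

def bitslice_l1(weight: torch.Tensor, total_bits: int = 8, slice_bits: int = 2):
    """Hypothetical bit-slice L1 penalty: the sum of L1 norms of all bit
    slices of the quantized weight magnitudes. Rounding and slicing are
    non-differentiable, so a straight-through trick supplies a surrogate
    gradient.
    """
    n_levels = 2 ** total_bits - 1
    scale = weight.abs().max().clamp_min(1e-8) / n_levels
    q = weight.abs() / scale                        # real-valued code in [0, n_levels]
    q_int = q.detach().round().clamp(0, n_levels).to(torch.int64)

    penalty = torch.zeros((), device=weight.device)
    mask = (1 << slice_bits) - 1
    for i in range(total_bits // slice_bits):
        slice_int = (q_int >> (slice_bits * i)) & mask  # true slice value
        surrogate = q / (2 ** (slice_bits * i))         # differentiable proxy
        # Forward pass uses the true slice; backward uses the proxy's gradient.
        slice_val = surrogate + (slice_int.float() - surrogate).detach()
        penalty = penalty + slice_val.sum()             # slices are non-negative
    return penalty
```

Such a penalty would simply be added to the task loss during fine-tuning, e.g. `loss = criterion(out, y) + lam * bitslice_l1(layer.weight)`.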

The scripts define default values for their arguments, but you may want to inspect them yourself and adjust as needed.

## Acknowledgement

The code is adapted from nics_fix_pytorch.