Bit-slice Sparsity

This repo contains the code for our preliminary study "Exploring Bit-Slice Sparsity in Deep Neural Networks for Efficient ReRAM-Based Deployment" (NeurIPS'19 EMC2 workshop) [paper][poster][presentation], which aims to improve bit-slice sparsity for efficient ReRAM-based deployment of DNNs. The code is tested with PyTorch 1.2.0 and Python 3.7.

The code for MNIST and CIFAR-10 is in mnist/ and cifar/, respectively. The training routine consists of three stages: pre-training, pruning, and fine-tuning.

First, pre-train a fixed-point model:

```
python pretrain.py
```
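Here, "fixed-point" means the model is trained quantization-aware: weights are rounded to a low-bit grid in the forward pass. The repo handles this via nics_fix_pytorch; purely as a minimal sketch of the general idea (the function name, bit width, and scaling scheme below are our assumptions, not the repo's API):

```python
import torch

def fixed_point(x, bits=8):
    # Symmetric per-tensor scale: the largest magnitude maps to the
    # largest representable integer.
    scale = x.abs().max() / (2 ** (bits - 1) - 1)
    q = torch.round(x / scale).clamp(-(2 ** (bits - 1)), 2 ** (bits - 1) - 1) * scale
    # Straight-through estimator: quantized values in the forward pass,
    # identity gradient in the backward pass.
    return x + (q - x).detach()
```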

Then load the pre-trained model, prune it, and fine-tune with either plain L1 regularization or bit-slice L1 regularization:

```
python finetune_l1.py
```

or

```
python finetune_bitslice.py
```
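The bit-slice regularizer penalizes the magnitude of each low-bit slice of a quantized weight rather than the weight as a whole, pushing individual slices (not just whole weights) toward zero. As a rough sketch of how such a penalty could be computed, assuming 8-bit fixed-point weights split into 2-bit slices (the actual implementation is in finetune_bitslice.py; the function name and bit widths here are our assumptions):

```python
import torch

def bitslice_l1(weight, scale, total_bits=8, slice_bits=2):
    """Sketch of a bit-slice L1 penalty. Illustrative only: the repo's
    regularizer must remain differentiable (e.g., via a straight-through
    estimator), which this sketch glosses over."""
    # Map quantized weight magnitudes to integers in [0, 2^total_bits - 1].
    q = torch.round(weight.abs() / scale).clamp(0, 2 ** total_bits - 1).long()
    penalty = 0.0
    for i in range(0, total_bits, slice_bits):
        # Extract the slice covering bits [i, i + slice_bits).
        s = (q >> i) & ((1 << slice_bits) - 1)
        # The L1 norm of a non-negative slice is simply its sum.
        penalty = penalty + s.float().sum()
    return penalty
```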

The scripts define several arguments with default values; you may want to review them and adjust them for your setup.

Acknowledgement

The code is adapted from nics_fix_pytorch.
