model-compression-optimization

Model compression and optimization for deployment with PyTorch, including knowledge distillation, quantization, and pruning.

1 Pruning

Method overview

| Pruning Method | Code location | Docs | Remark |
| --- | --- | --- | --- |
| 01 Seminal work: Learning Efficient Convolutional Networks Through Network Slimming (ICCV2017) | code: pruning/01NetworkSlimming<br>code reference: link1, link2 | docs | placeholder |
| 02 ThiNet (ICCV2017) | code: 1pruning/02ThiNet<br>code reference: https://github.com/SSriven/ThiNet | docs | 1 |
| 03 HRank (CVPR2020) | code: 1pruning/03HRank<br>code reference: link | docs | placeholder |
| Coming... | 1 | 1 | 1 |

01 Learning Efficient Convolutional Networks Through Network Slimming (ICCV2017)

docs: https://www.yuque.com/huangzhongqing/pytorch/iar4s1

code: pruning/01NetworkSlimming

code reference:
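
The sketch below is a minimal illustration of the Network Slimming recipe, not the training code under pruning/01NetworkSlimming: during training an L1 sparsity term is imposed on every BatchNorm scale factor (gamma), and after training a global threshold on |gamma| selects the channels to prune. The helper names and the sparsity weight `s` are illustrative choices.

```python
import torch
import torch.nn as nn

def add_bn_sparsity_grad(model: nn.Module, s: float = 1e-4) -> None:
    """Network Slimming step 1: after loss.backward(), add the subgradient of an
    L1 penalty on every BatchNorm scale factor (gamma) before optimizer.step()."""
    for m in model.modules():
        if isinstance(m, nn.BatchNorm2d) and m.weight.grad is not None:
            m.weight.grad.add_(s * torch.sign(m.weight.detach()))

def bn_channel_masks(model: nn.Module, prune_ratio: float = 0.5) -> dict:
    """Network Slimming step 2: rank all gammas globally and keep only the
    channels whose |gamma| lies above the prune_ratio quantile."""
    gammas = torch.cat([m.weight.detach().abs().flatten()
                        for m in model.modules() if isinstance(m, nn.BatchNorm2d)])
    threshold = gammas.sort().values[int(len(gammas) * prune_ratio)]
    return {name: m.weight.detach().abs() > threshold
            for name, m in model.named_modules() if isinstance(m, nn.BatchNorm2d)}
```

In a training loop, `add_bn_sparsity_grad` would be called between `loss.backward()` and `optimizer.step()`; the returned masks then guide building and fine-tuning the slimmer network.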

02 TODO

2 Quantization

01 TODO

Method overview

| Quantization Method | Code location | Docs | Remark |
| --- | --- | --- | --- |
| Coming... | 1 | 1 | 1 |
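
Until the quantization entries land, here is a generic, minimal sketch of post-training dynamic quantization using PyTorch's built-in `torch.quantization.quantize_dynamic`; the toy model is made up for illustration and is not tied to any method planned for this table.

```python
import torch
import torch.nn as nn

# A toy float32 model; any module containing nn.Linear layers works the same way.
float_model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
float_model.eval()

# Post-training dynamic quantization: Linear weights are stored as int8 and
# activations are quantized on the fly at inference time.
quantized_model = torch.quantization.quantize_dynamic(
    float_model, {nn.Linear}, dtype=torch.qint8
)

with torch.no_grad():
    out = quantized_model(torch.randn(1, 128))
print(out.shape)  # torch.Size([1, 10])
```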

3 Knowledge Distillation

Method overview

| KD Method | Code location | Docs | Remark |
| --- | --- | --- | --- |
| 01 Seminal work: Distilling the Knowledge in a Neural Network (NIPS2014) | code: 3distillation/01Distilling the knowledge in a neural network<br>code reference: https://github.com/Eli-yu-first/Artificial_Intelligence | https://www.yuque.com/huangzhongqing/lightweight/lno6i7 | 1 |
| 02 Channel-wise Knowledge Distillation for Dense Prediction (ICCV2021) | code: 3distillation/02SemSeg-distill<br>code reference: https://github.com/irfanICMLL/TorchDistiller/tree/main/SemSeg-distill | https://www.yuque.com/huangzhongqing/lightweight/dourdf2ogh9y1cx9#VHZBv | 1 |
| Coming... | 1 | 1 | 1 |

01 Seminal work: Distilling the Knowledge in a Neural Network (NIPS2014)

docs: https://www.yuque.com/huangzhongqing/lightweight/lno6i7

code: 3distillation/01Distilling the knowledge in a neural network

code reference: https://github.com/Eli-yu-first/Artificial_Intelligence
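
As a hedged sketch of the paper's core loss (the code under 3distillation/01 may be organized differently), the distillation objective mixes a KL term between temperature-softened student and teacher logits with the usual cross-entropy on the hard labels; the temperature `T` and mixing weight `alpha` below are arbitrary example values.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      T: float = 4.0, alpha: float = 0.7) -> torch.Tensor:
    """Soft-target loss from Hinton et al.: KL between temperature-softened
    student and teacher distributions (scaled by T^2), plus hard-label CE."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard

# Toy shapes: batch of 8, 10 classes.
loss = distillation_loss(torch.randn(8, 10), torch.randn(8, 10),
                         torch.randint(0, 10, (8,)))
```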

02 Channel-wise Knowledge Distillation for Dense Prediction (ICCV2021)

docs: https://www.yuque.com/huangzhongqing/lightweight/dourdf2ogh9y1cx9#VHZBv

code: 3distillation/02SemSeg-distill

code reference: https://github.com/irfanICMLL/TorchDistiller/tree/main/SemSeg-distill
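
A hedged sketch of the channel-wise distillation loss (the reference implementation in TorchDistiller may differ in details): each channel of the prediction map is softmax-normalized over its spatial positions, and the student is matched to the teacher with a KL divergence scaled by T^2. Shapes are assumed to be already aligned between the two networks.

```python
import torch
import torch.nn.functional as F

def channel_wise_kd_loss(student_feat: torch.Tensor,
                         teacher_feat: torch.Tensor,
                         T: float = 4.0) -> torch.Tensor:
    """Channel-wise distillation for dense prediction: per-channel spatial
    softmax on teacher and student maps, then KL divergence between them.
    Both tensors are assumed to be (N, C, H, W) with matching shapes."""
    n, c, h, w = teacher_feat.shape
    log_s = F.log_softmax(student_feat.reshape(n, c, -1) / T, dim=-1)
    t = F.softmax(teacher_feat.reshape(n, c, -1) / T, dim=-1)
    return F.kl_div(log_s, t, reduction="sum") * (T * T) / (n * c)

# Toy segmentation-style maps: 2 images, 19 classes, 64x64 resolution.
loss = channel_wise_kd_loss(torch.randn(2, 19, 64, 64), torch.randn(2, 19, 64, 64))
```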

4 Neural Architecture Search (NAS)

video:

Method overview

| NAS Method | Code location | Docs | Remark |
| --- | --- | --- | --- |
| 01 DARTS (ICLR 2019), Differentiable Neural Architecture Search: a beginner-friendly first NAS model | code: 4NAS/01DARTS(ICLR2019)/pt.darts<br>code reference: https://github.com/khanrc/pt.darts | https://www.yuque.com/huangzhongqing/lightweight/esyutcdebpmowgi3 | video: [Paper walkthrough] DARTS, a differentiable neural architecture search algorithm: https://www.bilibili.com/video/BV1Mm4y1R7Cw/?vd_source=617461d43c4542e4c5a3ed54434a0e55 |
| Coming... | 1 | 1 | 1 |

01 DARTS (ICLR 2019), Differentiable Neural Architecture Search: a beginner-friendly first NAS model

docs: https://www.yuque.com/huangzhongqing/lightweight/esyutcdebpmowgi3

code: 4NAS/01DARTS(ICLR2019)/pt.darts

code reference: https://github.com/khanrc/pt.darts

video: [Paper walkthrough] DARTS, a differentiable neural architecture search algorithm: https://www.bilibili.com/video/BV1Mm4y1R7Cw/?vd_source=617461d43c4542e4c5a3ed54434a0e55
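
To make the "differentiable" part concrete, here is a toy sketch of the DARTS continuous relaxation, not the pt.darts code itself: each edge holds one architecture parameter per candidate operation and outputs a softmax-weighted mixture of their results, so the architecture choice can be optimized by gradient descent. The full method also alternates architecture and weight updates on separate data splits (bilevel optimization), which is omitted here.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedOp(nn.Module):
    """One DARTS edge: a softmax over architecture parameters (alpha) weights
    the outputs of all candidate operations, making the choice differentiable."""

    def __init__(self, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(candidate_ops)
        # One architecture parameter per candidate op; in full DARTS these are
        # optimized on validation data, separately from the network weights.
        self.alpha = nn.Parameter(1e-3 * torch.randn(len(candidate_ops)))

    def forward(self, x):
        weights = F.softmax(self.alpha, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# Toy edge with two candidate ops on 16-channel feature maps.
edge = MixedOp([nn.Conv2d(16, 16, 3, padding=1),
                nn.MaxPool2d(3, stride=1, padding=1)])
out = edge(torch.randn(2, 16, 32, 32))  # both ops preserve the spatial size
```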

02 TODO

TODO list

License

Copyright (c) 双愚. All rights reserved.

Licensed under the MIT License.


WeChat official account: 【双愚】 (huang_chongqing), covering research and technology as well as reflections on life. You are welcome to follow.


Previous posts:

  1. This article offers no career advice, yet it can help you for a lifetime
  2. A chat about college students' job interviews
  3. Liu Zhiyuan of Tsinghua University: where do good research methods come from
