[NeurIPS 2023] Structural Pruning for Diffusion Models
A framework for pruning LLMs to any size and any configuration.
Awesome papers and resources on deep neural network pruning, with source code.
Code for CHIP: CHannel Independence-based Pruning for Compact Neural Networks (NeurIPS 2021).
The official implementation of the paper "Pruning via Iterative Ranking of Sensitivity Statistics", providing novel pruning/compression algorithms for deep neural networks. Among others, it implements structured pruning before training with actual parameter shrinking, as well as unstructured pruning before and during training.
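To make the distinction concrete: structured pruning with parameter shrinking, as mentioned above, removes whole neurons or channels and physically reduces the tensor shapes, unlike mask-based unstructured pruning. Below is a minimal, generic NumPy sketch of magnitude-based structured pruning of a linear layer; the function name and ranking criterion (row L2 norm) are illustrative assumptions, not the specific algorithm of any repository listed here.

```python
import numpy as np

def structured_prune_linear(weight, bias, keep_ratio=0.5):
    """Illustrative sketch: prune output neurons (rows) of a linear layer
    by L2-norm ranking, physically shrinking the parameter tensors."""
    n_out = weight.shape[0]
    n_keep = max(1, int(round(n_out * keep_ratio)))
    # Rank rows by magnitude and keep the largest ones, preserving order.
    norms = np.linalg.norm(weight, axis=1)
    keep = np.sort(np.argsort(norms)[-n_keep:])
    return weight[keep], bias[keep], keep

# Example: a 4-neuron layer shrunk to 2 neurons.
W = np.array([[1.0, 1.0],
              [0.1, 0.0],
              [2.0, 2.0],
              [0.0, 0.2]])
b = np.zeros(4)
W_p, b_p, kept = structured_prune_linear(W, b, keep_ratio=0.5)
print(W_p.shape)  # the pruned layer is genuinely smaller, not just masked
```

Note that in a real network the next layer's input dimension must be shrunk to match the kept indices; that bookkeeping is what structured-pruning frameworks automate.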
Knowledge distillation from Ensembles of Iterative pruning (BMVC 2020)
A framework that helps developers apply structured pruning to TensorFlow models.
[AAAI 2024] Fluctuation-based Adaptive Structured Pruning for Large Language Models
Code for the paper "NASH: A Simple Unified Framework of Structured Pruning for Accelerating Encoder-Decoder Language Models" (EMNLP 2023 Findings).
💍 Efficient tensor-based filter pruning
Code repository for paper "Efficient Structured Pruning and Architecture Searching for Group Convolution" https://arxiv.org/abs/1811.09341
Deepak Ghimire, Kilho Lee, and Seong-heum Kim, Loss-aware automatic selection of structured pruning criteria for deep neural network acceleration, Image and Vision Computing, vol. 136, p. 104745, 2023.