💍 Efficient tensor decomposition-based filter pruning
Updated May 21, 2024 - Jupyter Notebook
Awesome papers and resources on deep neural network pruning, with source code.
Deepak Ghimire, Kilho Lee, and Seong-heum Kim, Loss-aware automatic selection of structured pruning criteria for deep neural network acceleration, Image and Vision Computing, vol. 136, p. 104745, 2023.
A framework for pruning LLMs to any size and any configuration.
[NeurIPS 2023] Structural Pruning for Diffusion Models
[AAAI 2024] Fluctuation-based Adaptive Structured Pruning for Large Language Models
Code for the paper "NASH: A Simple Unified Framework of Structured Pruning for Accelerating Encoder-Decoder Language Models" (EMNLP 2023 Findings)
Official implementation of the paper "Pruning via Iterative Ranking of Sensitivity Statistics", providing novel pruning/compression algorithms for deep neural networks. Among other things, it implements structured pruning before training (with actual parameter shrinking) and unstructured pruning before and during training.
Code for "CHIP: CHannel Independence-based Pruning for Compact Neural Networks" (NeurIPS 2021).
A framework that helps developers apply structured pruning to TensorFlow models.
Code repository for paper "Efficient Structured Pruning and Architecture Searching for Group Convolution" https://arxiv.org/abs/1811.09341
Knowledge distillation from Ensembles of Iterative pruning (BMVC 2020)
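Many of the projects above rank and remove whole convolutional filters by some importance criterion. As a generic illustration only (not the method of any specific repository listed here), the sketch below prunes the weakest filters of a convolution weight tensor by L1 norm; the function name and the `keep_ratio` parameter are hypothetical:

```python
import numpy as np

def prune_filters_l1(weight, keep_ratio=0.5):
    """Structured filter pruning sketch: rank the output filters of a
    conv weight tensor (out_channels, in_channels, kH, kW) by L1 norm
    and keep only the strongest fraction."""
    n_out = weight.shape[0]
    n_keep = max(1, int(round(n_out * keep_ratio)))
    # Importance score: L1 norm (sum of absolute weights) per filter
    scores = np.abs(weight).sum(axis=(1, 2, 3))
    # Indices of the filters to keep, restored to original order
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])
    return weight[keep], keep

# Example: keep half of 8 random filters
w = np.random.randn(8, 3, 3, 3)
pruned, kept = prune_filters_l1(w, keep_ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

In a real network, removing output filters of one layer also requires slicing the corresponding input channels of the next layer; the frameworks listed above automate that bookkeeping.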