Pruning neural networks directly with back-propagation
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" and an application of the hypothesis to reinforcement learning (a minimal magnitude-pruning sketch appears after this list).
A study verifying how well the lottery ticket (LT) hypothesis generalizes to LSTM models and NLP tasks.
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
A collection of works on reducing model size and on ASIC/FPGA accelerators for machine learning.
Counting currency from video using RepNet as a base model.
Code for the project "SNIP: Single-Shot Network Pruning"
Network acceleration methods
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Implementation of AutoSlim using TensorFlow 2
[ICCV 2017] Learning Efficient Convolutional Networks through Network Slimming
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)
Sparse variational dropout in TensorFlow 2
PyTorch implementation of our TNNLS paper: Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
Channel-Prioritized Convolutional Neural Networks for Sparsity and Multi-fidelity
Reducing the computational overhead of Deep CNNs through parameter pruning and tensor decomposition.
Cheng-Hao Tu, Jia-Hong Lee, Yi-Ming Chan and Chu-Song Chen, "Pruning Depthwise Separable Convolutions for MobileNet Compression," International Joint Conference on Neural Networks, IJCNN 2020, July 2020.
💍 Efficient tensor decomposition-based filter pruning
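Many of the repositories above (lottery ticket hypothesis, network slimming, filter pruning) share one core operation: ranking weights by magnitude and masking out the smallest ones before or during retraining. The following is a minimal PyTorch sketch of that step only, assuming global unstructured magnitude pruning; the function name `magnitude_prune`, the layer selection, and the 80% sparsity level are illustrative assumptions, not code from any listed repository.

```python
import torch
import torch.nn as nn

def magnitude_prune(model: nn.Module, sparsity: float = 0.8) -> dict:
    """Build binary masks that zero out the smallest-magnitude weights globally.

    `sparsity` is the fraction of weights to remove. This is an illustrative
    sketch, not the implementation of any repository listed above.
    """
    # Collect prunable weight tensors (skip biases and norm parameters).
    weights = {name: p for name, p in model.named_parameters() if p.dim() > 1}

    # Rank all weights globally by absolute value and find the cut-off.
    all_scores = torch.cat([p.detach().abs().flatten() for p in weights.values()])
    k = int(sparsity * all_scores.numel())
    threshold = torch.kthvalue(all_scores, k).values if k > 0 else all_scores.min() - 1

    masks = {}
    for name, p in weights.items():
        masks[name] = (p.detach().abs() > threshold).float()
        p.data.mul_(masks[name])  # zero out the pruned weights in place
    return masks

# Usage: prune a small MLP to ~80% sparsity; during any subsequent
# (re)training, re-apply the masks after each optimizer step so that
# pruned weights stay at zero.
model = nn.Sequential(nn.Linear(784, 300), nn.ReLU(), nn.Linear(300, 10))
masks = magnitude_prune(model, sparsity=0.8)
```

Lottery-ticket-style experiments typically combine this masking step with rewinding the surviving weights to their initial values and retraining; structured methods such as network slimming or filter pruning instead score and remove whole channels or filters.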