A study verifying the generality of the lottery ticket (LT) hypothesis for LSTM models and NLP tasks.
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
A collection of works aimed at reducing model size or building ASIC/FPGA accelerators for machine learning.
Counting currency from video using RepNet as a base model.
Code for the project "SNIP: Single-Shot Network Pruning"
Network acceleration methods
Sparse variational dropout in TensorFlow 2
Channel-Prioritized Convolutional Neural Networks for Sparsity and Multi-fidelity
Pruning neural networks directly with back-propagation
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
[ICCV 2017] Learning Efficient Convolutional Networks through Network Slimming
This repository contains a Pytorch implementation of the article "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" and an application of this hypothesis to reinforcement learning
Reducing the computational overhead of Deep CNNs through parameter pruning and tensor decomposition.
Pytorch implementation of our paper (TNNLS) -- Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
Improved Implementation of Single Shot MultiBox Detector, RefineDet and Network Optimization in Pytorch 07/2018
💍 Efficient tensor decomposition-based filter pruning
Implementation of AutoSlim using TensorFlow 2
[NIPS 2016] Learning Structured Sparsity in Deep Neural Networks
Cheng-Hao Tu, Jia-Hong Lee, Yi-Ming Chan and Chu-Song Chen, "Pruning Depthwise Separable Convolutions for MobileNet Compression," International Joint Conference on Neural Networks, IJCNN 2020, July 2020.
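For readers new to the topic, here is a minimal, hedged sketch of unstructured magnitude pruning, the baseline that many of the repositories above build on or compare against. It uses PyTorch's built-in torch.nn.utils.prune utilities; the model architecture and the 50% sparsity level are illustrative choices, not taken from any of the listed projects.

```python
# Minimal sketch: L1 (magnitude) unstructured pruning with torch.nn.utils.prune.
# The toy model and the sparsity amount are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

model = nn.Sequential(nn.Linear(784, 256), nn.ReLU(), nn.Linear(256, 10))

# Prune 50% of the smallest-magnitude weights in each Linear layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)

# Make the pruning permanent (drop the mask/reparametrization, keep zeroed weights).
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.remove(module, "weight")

# Report the resulting weight sparsity.
linears = [m for m in model.modules() if isinstance(m, nn.Linear)]
total = sum(m.weight.numel() for m in linears)
zeros = sum((m.weight == 0).sum().item() for m in linears)
print(f"sparsity: {zeros / total:.2%}")
```

Structured methods such as network slimming or filter pruning remove whole channels or filters instead of individual weights, which is what makes them directly useful for the hardware-oriented work listed above.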