Implementation of the article "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks"
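The paper's core procedure is iterative magnitude pruning: train the network, prune the smallest-magnitude weights, rewind the survivors to their original initialization, and repeat. A minimal PyTorch sketch of one such round, assuming a caller-supplied `train_fn` and a per-layer `masks` dict (both hypothetical names, not the repo's API):

```python
import torch

def prune_by_magnitude(weight, mask, fraction):
    """Zero out the smallest `fraction` of the still-active weights."""
    alive = weight.abs()[mask.bool()]
    k = int(fraction * alive.numel())
    if k == 0:
        return mask
    threshold = alive.sort().values[k - 1]
    return mask * (weight.abs() > threshold).to(mask.dtype)

def imp_round(model, init_state, masks, train_fn, prune_frac=0.2):
    """One train -> prune -> rewind round. `masks` maps parameter names to
    binary tensors; `init_state` is a copy of the initial state_dict."""
    train_fn(model, masks)                     # 1. train the masked network
    for name, p in model.named_parameters():   # 2. prune per layer
        if name in masks:
            masks[name] = prune_by_magnitude(p.data, masks[name], prune_frac)
    model.load_state_dict(init_state)          # 3. rewind to initialization...
    for name, p in model.named_parameters():
        if name in masks:
            p.data.mul_(masks[name])           #    ...keeping only survivors
    return masks
```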
GitHub page for SSDFA
Neural Network Sparsification via Pruning
Code for "Variational Depth Search in ResNets" (https://arxiv.org/abs/2002.02797)
PyTorch implementation of the paper "SpaceNet: Make Free Space For Continual Learning".
Simple C++ implementation of a sparsely connected multi-layer neural network using OpenMP and CUDA for parallelization.
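For illustration only (in Python rather than that repo's C++/OpenMP/CUDA), here is what a sparsely connected layer's forward pass can look like when the weights are stored in CSR format; all sizes and the 5% density are made up for the example:

```python
import numpy as np
from scipy.sparse import random as sparse_random

# A 256 -> 128 layer keeping ~5% of connections, stored in CSR format.
W = sparse_random(128, 256, density=0.05, format="csr", random_state=0)
b = np.zeros(128)

def forward(x):
    # The sparse matrix-vector product touches only the stored weights,
    # which is where the memory/compute savings come from.
    return np.maximum(W @ x + b, 0.0)  # ReLU activation

y = forward(np.random.default_rng(0).standard_normal(256))
print(y.shape)  # (128,)
```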
Code for testing DCT plus Sparse (DCTpS) networks
Always sparse. Never dense. But never say never. A sparse-training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, Sparse Evolutionary Training (SET), which boosts deep-learning scalability in several respects (e.g., memory and computational efficiency, representation and generalization power).
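SET periodically rewires the sparse topology during training: it drops a fraction of the weakest connections and regrows the same number at random empty positions. A hedged PyTorch sketch of that prune-and-regrow step (the function name and the `zeta` default are illustrative, not the repo's API):

```python
import torch

def set_rewire(weight, mask, zeta=0.3):
    """One prune-and-regrow update on a layer's binary connectivity mask."""
    alive = mask.bool()
    n_drop = int(zeta * alive.sum().item())
    if n_drop == 0:
        return mask
    # Drop the smallest-magnitude active connections.
    scores = weight.abs().masked_fill(~alive, float("inf")).flatten()
    drop = torch.topk(scores, n_drop, largest=False).indices
    new_mask = mask.flatten().clone()
    new_mask[drop] = 0
    # Regrow the same number of connections at random empty positions.
    empty = (new_mask == 0).nonzero().flatten()
    grow = empty[torch.randperm(empty.numel())[:n_drop]]
    new_mask[grow] = 1
    return new_mask.view_as(mask)
```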
A neural net with a terminal-based testing program.
[ICLR 2022] "Learning Pruning-Friendly Networks via Frank-Wolfe: One-Shot, Any-Sparsity, and No Retraining" by Lu Miao*, Xiaolong Luo*, Tianlong Chen, Wuyang Chen, Dong Liu, Zhangyang Wang
[ICLR 2022] "Peek-a-Boo: What (More) is Disguised in a Randomly Weighted Neural Network, and How to Find It Efficiently", by Xiaohan Chen, Jason Zhang and Zhangyang Wang.
Robustness of Sparse Multilayer Perceptrons for Supervised Feature Selection
[IJCAI 2022] "Dynamic Sparse Training for Deep Reinforcement Learning" by Ghada Sokar, Elena Mocanu, Decebal Constantin Mocanu, Mykola Pechenizkiy, and Peter Stone.
PyTorch Implementation of TopKAST
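Top-KAST uses only the top-K weights by magnitude in the forward pass while updating a larger parameter set in the backward pass. A minimal sketch of the forward mask alone, assuming nothing about the linked implementation:

```python
import torch

def topk_forward_mask(weight, density=0.1):
    """Binary mask keeping the top `density` fraction of weights by magnitude."""
    k = max(1, int(density * weight.numel()))
    threshold = torch.topk(weight.abs().flatten(), k).values[-1]
    return (weight.abs() >= threshold).to(weight.dtype)

w = torch.randn(128, 256, requires_grad=True)
sparse_w = w * topk_forward_mask(w)
# In this simplified sketch gradients reach only the active weights; the
# paper's larger backward set (its key ingredient) is omitted here.
```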
[TMLR] Supervised Feature Selection with Neuron Evolution in Sparse Neural Networks
Repository for the SNN-22 Workshop paper "Generalization and Memorization in Sparse Neural Networks".
[Machine Learning Journal (ECML-PKDD 2022 journal track)] A Brain-inspired Algorithm for Training Highly Sparse Neural Networks
Demo code for the CVPR 2023 paper "Sparsifiner: Learning Sparse Instance-Dependent Attention for Efficient Vision Transformers"
[ICLR 2023] "Sparsity May Cry: Let Us Fail (Current) Sparse Neural Networks Together!" Shiwei Liu, Tianlong Chen, Zhenyu Zhang, Xuxi Chen, Tianjin Huang, Ajay Kumar Jaiswal, Zhangyang Wang