Libraries for applying sparsification recipes to neural networks with a few lines of code, enabling faster and smaller models
CS328 Introduction to Data Science - Prof. Anirban Dasgupta - Project: Sparsifying Networks while Preserving Properties
Sparsity-aware deep learning inference runtime for CPUs
Code for CRATE (Coding RAte reduction TransformEr).
Improves the communication efficiency of federated learning by sparsifying the parameters uploaded by clients.
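The idea behind this line of work is that each client only transmits the largest-magnitude entries of its update and zeros out the rest. A minimal NumPy sketch of that top-k scheme (the function name `sparsify_update` and the `keep_ratio` parameter are illustrative, not from the linked repository):

```python
import numpy as np

def sparsify_update(update, keep_ratio=0.1):
    """Keep only the largest-magnitude entries of a client update.

    Hypothetical helper: the zeroed entries need not be transmitted,
    which reduces the upload cost in federated learning.
    """
    flat = update.ravel()
    k = max(1, int(keep_ratio * flat.size))
    # Indices of the k largest-magnitude values.
    idx = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[idx] = flat[idx]
    return sparse.reshape(update.shape)

rng = np.random.default_rng(0)
u = rng.standard_normal((4, 4))
s = sparsify_update(u, keep_ratio=0.25)
print(np.count_nonzero(s))  # only 4 of 16 entries survive
```

In practice such schemes are usually paired with error feedback (accumulating the dropped residual locally) so the discarded mass is not lost across rounds.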
Feather is a module that enables effective sparsification of neural networks during training. This repository accompanies the paper "Feather: An Elegant Solution to Effective DNN Sparsification" (BMVC2023).
Sparsify Your Flux Models
An implementation and report of the twice Ramanujan graph sparsifiers.
A research library for pytorch-based neural network pruning, compression, and more.
(Unstructured) Weight Pruning via Adaptive Sparsity Loss
Complex-valued neural networks for pytorch and Variational Dropout for real and complex layers.
A simple C++14 and CUDA-based header-only library with tools for sparse machine learning.
Repository to track the progress in model compression and acceleration
TensorFlow implementation of weight and unit pruning and sparsification
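The two schemes named here differ in granularity: weight pruning zeros individual small-magnitude weights (unstructured), while unit pruning zeros entire columns, i.e. whole neurons, by column norm (structured). A minimal NumPy sketch of both, under the usual magnitude-based criterion (these helpers are illustrative, not the linked repository's API):

```python
import numpy as np

def weight_prune(w, sparsity=0.5):
    """Zero the smallest-magnitude individual weights (unstructured)."""
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # Magnitude threshold below which weights are dropped.
    thresh = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)

def unit_prune(w, sparsity=0.5):
    """Zero whole columns (units) with the smallest L2 norms (structured)."""
    norms = np.linalg.norm(w, axis=0)
    k = int(sparsity * w.shape[1])
    pruned = w.copy()
    if k:
        pruned[:, np.argsort(norms)[:k]] = 0.0
    return pruned

rng = np.random.default_rng(1)
w = rng.standard_normal((3, 4))
print(np.count_nonzero(weight_prune(w)), np.count_nonzero(unit_prune(w)))
```

Unit pruning usually costs more accuracy at a given sparsity, but the resulting dense-but-smaller matrices speed up inference on ordinary hardware without sparse kernels.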