ptdeco is a library for model optimization by decomposition built on top of PyTorch
Updated May 24, 2024 - Python
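Decomposition-based compression of the kind ptdeco performs can be illustrated with plain PyTorch. The sketch below is hypothetical and is not ptdeco's actual API; `decompose_linear` is an illustrative helper that replaces one linear layer with two smaller ones via truncated SVD of the weight matrix.

```python
import torch

def decompose_linear(layer: torch.nn.Linear, rank: int) -> torch.nn.Sequential:
    """Hypothetical sketch (not ptdeco's API): low-rank factorization of a
    linear layer. W ~ B @ A, where A has shape (rank, in_features) and
    B has shape (out_features, rank)."""
    W = layer.weight.data                       # (out_features, in_features)
    U, S, Vh = torch.linalg.svd(W, full_matrices=False)
    A = Vh[:rank, :]                            # top-`rank` right singular vectors
    B = U[:, :rank] * S[:rank]                  # left singular vectors scaled by singular values
    first = torch.nn.Linear(layer.in_features, rank, bias=False)
    second = torch.nn.Linear(rank, layer.out_features,
                             bias=layer.bias is not None)
    first.weight.data = A
    second.weight.data = B
    if layer.bias is not None:
        second.bias.data = layer.bias.data
    return torch.nn.Sequential(first, second)

layer = torch.nn.Linear(64, 32)
compressed = decompose_linear(layer, rank=8)
x = torch.randn(4, 64)
y = compressed(x)
# Parameter count for the weights drops from 64*32 to (64+32)*8.
```

With `rank` equal to the full rank of the weight matrix the factorization reproduces the original layer exactly (up to floating-point error); smaller ranks trade accuracy for fewer parameters and FLOPs, which is the core idea behind decomposition-based model optimization.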
This repo contains model compression (using TensorRT) and documentation for running various deep learning models on NVIDIA Jetson Orin and Nano (aarch64 architectures)
Gathers research papers, corresponding code (if available), reading notes, and other related materials about hot 🔥 fields in Computer Vision based on Deep Learning.
Awesome Knowledge Distillation
A curated list of awesome papers on NLP, Computer Vision, Model Compression, XAI, Reinforcement Learning, Security, and more
Characterization study repository for model compression method: pruning
TinyNeuralNetwork is an efficient and easy-to-use deep learning model compression framework.
Hyperparameter tuning with Microsoft NNI for automated machine learning (AutoML) experiments. The tool dispatches and runs trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyperparameters in different environments, such as a local machine, remote servers, or the cloud.
[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
[CVPR 2024 Highlight] Logit Standardization in Knowledge Distillation
Resources of our survey paper "Enabling AI on Edges: Techniques, Applications and Challenges"
A beginner's tutorial on model compression.
A collection of computer vision projects and tools.
An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model compression, and hyperparameter tuning.
OpenBA-V2: a 3B LLM (Large Language Model) with a T5 architecture, obtained by model pruning and continued pretraining from OpenBA-15B.
Awesome machine learning model compression research papers, tools, and learning material.
Efficient AI Backbones including GhostNet, TNT and MLP, developed by Huawei Noah's Ark Lab.
OTOv1-v3, NeurIPS, ICLR, TMLR, DNN Training, Compression, Structured Pruning, Erasing Operators, CNN, Diffusion, LLM
The Truth Is In There: Improving Reasoning in Language Models with Layer-Selective Rank Reduction
A Comparative Analysis of Sound Data Pre-processing and Deep Learning Model Compression Techniques: A Study on Forest Sound Classification