Extremely light-weight MixNet with Top-1 75.7% and 2.5M params
Stochastic Downsampling for Cost-Adjustable Inference and Improved Regularization in Convolutional Networks
Exploring Variational Deep Q Networks. A study undertaken for the University of Cambridge's R244 Computer Science Master's course. Inspired by https://arxiv.org/abs/1711.11225/.
Concise, Modular, Human-friendly PyTorch implementation of EfficientNet with Pre-trained Weights.
Concise, Modular, Human-friendly PyTorch implementation of MixNet with Pre-trained Weights.
[ICCV 2019] Harmonious Bottleneck on Two Orthogonal Dimensions, surpassing MobileNetV2
Any-Precision Deep Neural Networks (AAAI 2021)
[ACL'20] HAT: Hardware-Aware Transformers for Efficient Natural Language Processing
NeurIPS 2019 MicroNet Challenge, hosted by Google and DeepMind researchers: "Efficient Model for Image Classification With Regularization Tricks".
[CVPR 2019, Oral] HAQ: Hardware-Aware Automated Quantization with Mixed Precision
[ECCV 2018] AMC: AutoML for Model Compression and Acceleration on Mobile Devices
[JMLR'20] NeurIPS 2019 MicroNet Challenge Efficient Language Modeling, Champion
Efficient 3D Backbone Network for Temporal Modeling
S2-BNN: Bridging the Gap Between Self-Supervised Real and 1-bit Neural Networks via Guided Distribution Calibration (CVPR 2021)
[ICASSP'22] Integer-only Zero-shot Quantization for Efficient Speech Recognition
[MICCAI 2021] BiX-NAS: Searching Efficient Bi-directional Architecture for Medical Image Segmentation
Implementation of efficient backbones for computer vision tasks.
Melanoma classification using semi-supervised learning
[ICML'21 Oral] I-BERT: Integer-only BERT Quantization
[KDD'22] Learned Token Pruning for Transformers