Neural network compression via knowledge distillation
Updated Sep 27, 2020 · Jupyter Notebook
Trained a ResNet50 model from scratch on the ImageWoof dataset, reaching 83% accuracy.
🐕 A deep learning multi-class classification project using the ResNet family and the ImageWoof dataset. [WIP]
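Knowledge distillation compresses a network by training a smaller student to match the temperature-softened outputs of a larger teacher. As a minimal sketch of the standard (Hinton-style) distillation loss, assuming illustrative values for the temperature `T` and mixing weight `alpha` (not the exact settings used in this project):

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.7):
    """Blend of soft-target KL divergence and hard-label cross-entropy.

    T and alpha here are illustrative assumptions, not values from this repo.
    """
    p_teacher = softmax(teacher_logits, T)
    p_student = softmax(student_logits, T)
    # KL(teacher || student); the T^2 factor keeps gradient magnitudes
    # comparable across temperatures, as in the original KD formulation.
    kl = np.sum(
        p_teacher * (np.log(p_teacher + 1e-12) - np.log(p_student + 1e-12)),
        axis=-1,
    )
    soft_loss = (T ** 2) * kl.mean()
    # Standard cross-entropy of the student against the hard labels (T = 1).
    p_hard = softmax(student_logits, 1.0)
    hard_loss = -np.log(p_hard[np.arange(len(labels)), labels] + 1e-12).mean()
    return alpha * soft_loss + (1 - alpha) * hard_loss
```

When the student's logits exactly match the teacher's, the KL term vanishes and only the hard-label cross-entropy remains, which is a quick sanity check on an implementation.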