Autodistill is an ecosystem for using large, slow foundation models to train small, fast supervised models. Using Autodistill and its associated packages, you can go from unlabeled images to inference on a custom model running at the edge, with no human intervention in between.
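The workflow above can be sketched in plain Python. This is a toy illustration of the distillation pattern, not the real Autodistill API: `foundation_model`, `auto_label`, and `train_centroid_model` are hypothetical stand-ins, with a rule-based "foundation model" auto-labeling raw points and a tiny nearest-centroid classifier playing the role of the small supervised model.

```python
# Toy sketch of the autodistill idea (illustrative names, not the real API):
# a slow "foundation" model labels raw data, then a small model fits those labels.

def foundation_model(point):
    """Stand-in for a big zero-shot model: needs no training data.
    Labels a 2-D point by which side of the line y = x it falls on."""
    x, y = point
    return 1 if y > x else 0

def auto_label(unlabeled):
    """Labeling step: run the foundation model over every raw sample."""
    return [(p, foundation_model(p)) for p in unlabeled]

def train_centroid_model(dataset):
    """Fit a tiny nearest-centroid classifier on the auto-labeled data."""
    sums = {0: [0.0, 0.0, 0], 1: [0.0, 0.0, 0]}
    for (x, y), label in dataset:
        sums[label][0] += x
        sums[label][1] += y
        sums[label][2] += 1
    centroids = {
        label: (sx / n, sy / n)
        for label, (sx, sy, n) in sums.items() if n
    }

    def predict(point):
        # The "small, fast" model: one distance check per class.
        return min(
            centroids,
            key=lambda lbl: (point[0] - centroids[lbl][0]) ** 2
                          + (point[1] - centroids[lbl][1]) ** 2,
        )
    return predict

unlabeled = [(0.0, 1.0), (1.0, 3.0), (2.0, 0.5), (4.0, 1.0)]
dataset = auto_label(unlabeled)            # no human labeling involved
small_model = train_centroid_model(dataset)
print(small_model((0.5, 2.0)))             # → 1 (agrees with the foundation model)
```

The design point is that the expensive model is only consulted once, at labeling time; inference afterwards uses only the cheap distilled model.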
Repositories (53 total), including:
- autodistill (public): Images to inference with no labeling (use foundation models to train supervised models).
- autodistill-efficient-yolo-world (public): EfficientSAM + YOLO World base model for use with Autodistill.