This repository has been archived by the owner on Mar 15, 2024. It is now read-only.

What are the hyperparameters for finetuning on CIFAR-10/100 and other small datasets? #175

Open
Phuoc-Hoan-Le opened this issue Jul 22, 2022 · 2 comments

Comments

@Phuoc-Hoan-Le

I cannot find them in the paper. Are they the same as the hyperparameters used for training on ImageNet?

@TouvronHugo
Contributor

Hi @CharlesLeeeee,
Thanks for your message.
No, the hyperparameters are not the same (see here for more details).
Best,
Hugo
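
For readers landing here: below is a minimal fine-tuning sketch in plain PyTorch + timm. It is not the official DeiT transfer-learning recipe; the learning rate, optimizer, schedule, epoch count, and augmentation are illustrative assumptions only, and the linked comment above should be used for the actual hyperparameters.

```python
# Minimal CIFAR-100 fine-tuning sketch -- NOT the official DeiT recipe.
# All hyperparameters below are illustrative assumptions.
import torch
import timm
from torch import nn, optim
from torchvision import datasets, transforms
from torch.utils.data import DataLoader

device = "cuda" if torch.cuda.is_available() else "cpu"

# CIFAR-100 images are 32x32; resize to the 224x224 resolution the
# ImageNet-pretrained backbone expects.
transform = transforms.Compose([
    transforms.Resize(224),
    transforms.RandomHorizontalFlip(),
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
])
train_set = datasets.CIFAR100("./data", train=True, download=True, transform=transform)
train_loader = DataLoader(train_set, batch_size=128, shuffle=True, num_workers=4)

# Load an ImageNet-pretrained DeiT backbone from timm and replace the
# classification head with a 100-class one.
model = timm.create_model("deit_base_patch16_224", pretrained=True, num_classes=100).to(device)

# Assumed transfer-learning defaults (SGD + cosine schedule), not the paper's setting.
epochs = 10
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9, weight_decay=1e-4)
scheduler = optim.lr_scheduler.CosineAnnealingLR(optimizer, T_max=epochs)
criterion = nn.CrossEntropyLoss()

for epoch in range(epochs):
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
    scheduler.step()
    print(f"epoch {epoch + 1}/{epochs} done, last batch loss {loss.item():.4f}")
```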

@kevinkasa

Hi @TouvronHugo, I noticed that those hyperparameters are from 2021, which I believe predates DeiT III. I was wondering whether you use the same finetuning hyperparameters for DeiT III, or if something else is used? Thank you!
