The results cannot be replicated #62

Open
Yintel12138 opened this issue Mar 10, 2024 · 2 comments

Comments

@Yintel12138

Title: Unable to Reproduce DKD Experiment Results on Tesla T4 Server Using Repository Code
Body: Dear maintainers,
I recently attempted to replicate the Decoupled Knowledge Distillation (DKD) results reported in the paper, using your repository's code on my Tesla T4 server. Unfortunately, I was not able to reproduce the documented results.
Could you please advise if there are any specific configurations or steps that I might have missed? Here is what I have done so far:
1. Set up the environment as per the documentation.
2. Pulled the latest code from the master branch of the repository.
3. Followed the instructions in the README file to set up the DKD experiment.
4. Ran the experiment with the default settings provided.
However, the results were significantly different from those reported in the paper. I would appreciate any guidance or recommendations to address this issue.
Thank you for your time and assistance.
Best regards,
Yintel
*(three screenshots attached)*

@Zzzzz1
Collaborator

Zzzzz1 commented Mar 11, 2024

Did you run the experiment on 8 GPUs? The per-GPU batch size becomes very small when running on 8 GPUs. The reported CIFAR-100 results were obtained on a single GPU.
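
For illustration, here is a minimal sketch of why the per-GPU batch matters; the global batch size of 64 is an assumption for the example, not necessarily the repository's actual default:

```python
import torch

# Illustrative only: assumes a global batch size of 64 (a common CIFAR-100
# default; the repository's actual value may differ).
global_batch_size = 64
num_gpus = max(torch.cuda.device_count(), 1)

# With DataParallel/DistributedDataParallel, each GPU sees only a slice of
# the global batch, which can hurt BatchNorm statistics and change results.
per_gpu_batch = global_batch_size // num_gpus
print(f"{num_gpus} GPU(s) -> {per_gpu_batch} samples per GPU per step")
```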

@Yintel12138
Author

Thank you, I will try it on a single GPU. Another question: if I want to use a larger batch size to speed up training, should the learning rate be increased or decreased?
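
For reference, a widely used convention is the linear scaling rule (Goyal et al., 2017): increase the learning rate proportionally with the batch size. A minimal sketch, using illustrative base values rather than the repository's actual defaults:

```python
# Linear scaling rule: lr_new = lr_base * (batch_new / batch_base).
# Base values below are assumptions for illustration only.
base_batch_size = 64
base_lr = 0.05

def scaled_lr(new_batch_size: int) -> float:
    """Scale the learning rate linearly with the batch size."""
    return base_lr * (new_batch_size / base_batch_size)

print(scaled_lr(256))  # 0.2 -- 4x the batch -> 4x the learning rate
```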
