
Correction for "Awesome-Knowledge-Distillation" #23

Open

fahad92virgo opened this issue May 3, 2022 · 0 comments

Comments

@fahad92virgo

Hi, thank you for listing our paper "Knowledge Distillation Beyond Model Compression"; we really appreciate your effort.

I would like to request a correction to the paper's authorship: it should be Sarfraz et al.

Also, our work isn't related to pruning or quantization; I believe a more relevant place for it would be the "Beyond" section.

agupt013 pushed a commit to agupt013/Awesome-Knowledge-Distillation that referenced this issue May 23, 2022