NLL & Perplexity Loss #144

Open
lethienhoa opened this issue May 13, 2018 · 2 comments

Comments

@lethienhoa

Hi,
It seems that Perplexity is normalized twice, and the norm_term of NLLLoss should exclude masked (padding) tokens as well.
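
For reference, here is a minimal sketch (not the repository's actual code) of how the two problems could be avoided: the NLL is summed with padding ignored, norm_term counts only non-padding tokens, and the normalization happens exactly once before exponentiating into perplexity. The function name and the arguments log_probs, targets, and pad_id are illustrative assumptions.

```python
import math
import torch
import torch.nn.functional as F

def masked_nll_and_perplexity(log_probs, targets, pad_id):
    """Compute per-token NLL and perplexity with a single normalization.

    log_probs: (num_tokens, vocab_size) log-probabilities from the decoder
    targets:   (num_tokens,) gold token ids, with pad_id marking padding
    """
    # Sum (rather than average) the NLL so we control the normalization
    # ourselves; padding positions contribute nothing to the sum.
    nll_sum = F.nll_loss(log_probs, targets, ignore_index=pad_id, reduction="sum")

    # norm_term counts only the non-padding tokens, matching the mask
    # applied to the loss itself.
    norm_term = targets.ne(pad_id).sum().float()

    # Normalize exactly once, then exponentiate for perplexity.
    nll_per_token = nll_sum / norm_term
    perplexity = math.exp(nll_per_token.item())
    return nll_per_token, perplexity

# Illustrative usage: vocabulary of 5 tokens, pad_id = 0.
log_probs = torch.log_softmax(torch.randn(7, 5), dim=-1)
targets = torch.tensor([2, 3, 1, 0, 0, 4, 2])  # two padding positions
nll, ppl = masked_nll_and_perplexity(log_probs, targets, pad_id=0)
```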

@manujosephv

Is this issue still open? I checked the code and didn't see the problems mentioned. Has it been fixed?

@woaksths

woaksths commented Sep 7, 2020

@lethienhoa
Yes, the NLLLoss norm_term still needs to be updated.
But I am also confused: why is the loss not divided by norm_term before calling loss.backward()?
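
As a rough illustration of that last point, dividing the accumulated loss by norm_term before calling backward() makes the gradient a per-token average rather than a sum that grows with batch size and sequence length. The names acc_loss and norm_term follow the discussion above; this is a sketch under those assumptions, not the repository's actual backward method.

```python
import torch

def normalized_backward(acc_loss: torch.Tensor, norm_term: float) -> float:
    """Backpropagate a per-token average loss.

    acc_loss:  scalar tensor holding the *summed* NLL over the batch
    norm_term: number of (non-padding) tokens that contributed to acc_loss
    """
    # Normalize before backward() so gradient magnitudes do not depend on
    # how many tokens happened to be in this batch.
    mean_loss = acc_loss / norm_term
    mean_loss.backward()
    return mean_loss.item()
```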
