
Potential bug in the loss function #5

Open
yygrechka opened this issue Sep 20, 2022 · 1 comment

Comments

@yygrechka

Why is Le being subtracted in the definition of Lc?
Is this a bug or just some clever optimization?

Lc = -torch.mean(torch.sum(F.log_softmax(labels_update, dim=1) * pred, dim=1)) - Le

@MohammedAlkhrashi

I think this is the definition of the KL divergence loss: KL(p ‖ q) = CrossEntropy(p, q) − Entropy(p). Subtracting the entropy term Le from the cross-entropy term turns it into a KL divergence, which is, I believe, what the author of the code is trying to do here.
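A minimal sketch of that identity for reference (p and q below are placeholder distributions, not the variables from the repository; this is not the repository's code):

import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Two arbitrary probability distributions over 5 classes (batch of 3).
p = F.softmax(torch.randn(3, 5), dim=1)   # reference distribution
q = F.softmax(torch.randn(3, 5), dim=1)   # approximating distribution

# Cross-entropy: H(p, q) = -sum_i p_i * log(q_i)
cross_entropy = -torch.sum(p * torch.log(q), dim=1)

# Entropy: H(p) = -sum_i p_i * log(p_i)
entropy = -torch.sum(p * torch.log(p), dim=1)

# KL divergence: KL(p || q) = sum_i p_i * log(p_i / q_i)
kl = torch.sum(p * torch.log(p / q), dim=1)

# The identity KL(p || q) = H(p, q) - H(p) holds numerically.
print(torch.allclose(kl, cross_entropy - entropy))  # True

Under this reading, the quoted line computes a cross-entropy term and then subtracts Le (an entropy term), so Lc would be a KL divergence rather than a plain cross-entropy.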
