
Question regarding symmetric KL Loss #145

Open
skbaur opened this issue Nov 30, 2023 · 0 comments

Comments

skbaur commented Nov 30, 2023

The symmetric KL loss as implemented here (for the SiFT loss),

def symmetric_kl(logits, target):
differs from the symmetrized Kullback–Leibler divergence: in particular, it is not zero when both inputs are equal, as one would expect (see https://en.wikipedia.org/wiki/Kullback–Leibler_divergence#Symmetrised_divergence). In that case it instead appears to equal twice the entropy, which would intuitively push the model towards higher-confidence predictions.
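
For reference, here is a minimal sketch of the symmetrized divergence as defined on the Wikipedia page (assuming both arguments are raw logits over the last dimension; the function and variable names are mine, not from this repository). It evaluates to zero for equal inputs:

import torch
import torch.nn.functional as F

def symmetric_kl_reference(logits, target):
    # Symmetrized KL: KL(p || q) + KL(q || p), with p = softmax(logits), q = softmax(target).
    p_log = F.log_softmax(logits, dim=-1)
    q_log = F.log_softmax(target, dim=-1)
    p, q = p_log.exp(), q_log.exp()
    kl_pq = (p * (p_log - q_log)).sum(dim=-1)
    kl_qp = (q * (q_log - p_log)).sum(dim=-1)
    return (kl_pq + kl_qp).mean()

x = torch.randn(4, 10)
print(symmetric_kl_reference(x, x.clone()))  # tensor(0.) up to floating point error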

Other implementations, see e.g. https://github.com/archinetai/smart-pytorch/blob/e96d8630dc58e1dce8540f61f91016849925ebfe/smart_pytorch/loss.py#L10, behave the way I would have expected from the name. Is there a reason to deviate from the more standard definition?
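
For what it's worth, a loss that sums the two cross entropies instead of the two KL terms would show exactly the behaviour described above: for equal inputs it reduces to 2·H(p) rather than 0. A sketch of that hypothetical variant, reusing the imports from the sketch above (I am not claiming this is exactly what the code here does):

def cross_entropy_sum(logits, target):
    # Hypothetical variant: CE(p, q) + CE(q, p) = KL(p||q) + KL(q||p) + H(p) + H(q),
    # which equals 2 * H(p) when logits == target.
    p_log = F.log_softmax(logits, dim=-1)
    q_log = F.log_softmax(target, dim=-1)
    p, q = p_log.exp(), q_log.exp()
    return (-(p * q_log).sum(dim=-1) - (q * p_log).sum(dim=-1)).mean()

x = torch.randn(4, 10)
entropy = -(F.softmax(x, dim=-1) * F.log_softmax(x, dim=-1)).sum(dim=-1).mean()
print(cross_entropy_sum(x, x.clone()), 2 * entropy)  # both are approximately 2 * H(p)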
