I think this is meant to be the KL divergence loss, using the identity KL(p ‖ q) = CrossEntropy(p, q) − Entropy(p), which is, I believe, what the author of the code is doing here.
Why is `Le` being subtracted in the definition of `Lc`?
Is this a bug, or just a clever optimization?
PENCIL/pencil_train.py, line 244 (commit a1b65d2)
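To make the question concrete, here is a minimal, self-contained sketch of the identity being discussed. This is not a verified quote of the repository's code; the variable names `output`, `last_y_var`, `le`, and `lc` just mirror the ones in `pencil_train.py`, and the shapes are made up for illustration:

```python
import torch
import torch.nn.functional as F

# Hypothetical batch: logits f(x) and a learned label distribution y_hat.
output = torch.randn(8, 10)                         # network logits
last_y_var = F.softmax(torch.randn(8, 10), dim=1)   # label distribution y_hat

p = F.softmax(output, dim=1)          # prediction p = softmax(f(x))
log_p = F.log_softmax(output, dim=1)

# Entropy of the prediction: Le = H(p) = -sum_c p log p
le = -torch.mean(torch.sum(p * log_p, dim=1))

# Cross-entropy of p against y_hat: H(p, y_hat) = -sum_c p log y_hat
ce = -torch.mean(torch.sum(p * torch.log(last_y_var), dim=1))

# KL(p || y_hat) = H(p, y_hat) - H(p): the entropy term is subtracted,
# which is why Le appears with a minus sign inside Lc.
lc = ce - le

# Equivalent direct form, matching KL's definition sum_c p * log(p / y_hat):
lc_direct = torch.mean(torch.sum(p * (log_p - torch.log(last_y_var)), dim=1))
assert torch.allclose(lc, lc_direct, atol=1e-6)
```

Under this reading, subtracting `Le` inside `Lc` is not a bug but simply the entropy term of the KL decomposition; whether that is what line 244 actually computes is exactly what I am asking.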