I noticed that `minimax_discriminator_loss` uses `binary_cross_entropy_with_logits` (cf. `binary_cross_entropy`), which "combines a Sigmoid layer and the BCELoss" according to the docs. This is good practice for numerical stability, but it would be worth emphasizing in the docs to prevent users from inserting a sigmoid as the final non-linearity in their custom models. Maybe even raise a warning if we can somehow detect this.
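To illustrate why the combined op matters, here is a minimal pure-Python sketch (not the library's actual implementation) contrasting the naive sigmoid-then-BCE computation with the stable fused formulation that `binary_cross_entropy_with_logits` documents:

```python
import math

def bce_naive(logit, target):
    """Sigmoid first, then BCE: breaks down for large-magnitude logits,
    because the sigmoid saturates to exactly 0.0 or 1.0 in floating point
    and log(0) is undefined."""
    p = 1.0 / (1.0 + math.exp(-logit))
    return -(target * math.log(p) + (1.0 - target) * math.log(1.0 - p))

def bce_with_logits(logit, target):
    """Fused, numerically stable formulation:
    max(x, 0) - x*z + log(1 + exp(-|x|))."""
    return max(logit, 0.0) - logit * target + math.log1p(math.exp(-abs(logit)))

# A confident wrong prediction (logit = 100, target = 0):
print(bce_with_logits(100.0, 0.0))  # large finite loss, ~100.0

try:
    bce_naive(100.0, 0.0)           # sigmoid saturates to 1.0 -> log(0.0)
except ValueError as e:
    print("naive version failed:", e)
```

This is also why a model that already ends in a sigmoid silently double-applies it when paired with the logits-based loss, which is the failure mode the warning would catch.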