
Binary Cross Entropy instead of MSE #10

Open
JanoschMenke opened this issue Oct 21, 2019 · 1 comment

@JanoschMenke

Hi,

Maybe a quite simple question, but in your regression example you pass `neuralfingerprint.util.rmse` as the `nll_func` via `build_conv_deep_net` to the `build_standard_net` function. However, I am not sure how `util.rmse` relates to the `mean_squared_error` function that is used inside `build_standard_net`.
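For reference, the two losses differ only by a square root, so minimizing one minimizes the other (a generic sketch of the relationship, not the repo's exact code):

```python
import numpy as np

def mse(preds, targets):
    # Mean squared error over all predictions.
    return np.mean((preds - targets) ** 2)

def rmse(preds, targets):
    # Root mean squared error: just the square root of the MSE.
    return np.sqrt(mse(preds, targets))
```

Because the square root is monotonic, both losses share the same minimizer, though their gradients scale differently during training.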

My goal is to adapt the example code so that I can do binary classification.
I tried to replace the default loss in `build_standard_net` with a `binary_cross_entropy` function, but I think I am missing something because the results do not make sense:

@duvenaud
Contributor

There might be a bug in that binary_cross_entropy method; try using the code from here:

https://github.com/HIPS/autograd/blob/master/examples/variational_autoencoder.py#L26
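The linked example computes a Bernoulli log-density from unnormalized scores with `np.logaddexp`, which avoids taking the log of a saturated sigmoid. A minimal sketch in that style (assuming targets encoded as +1/-1; the function name here is illustrative, not the repo's API):

```python
import numpy as np

def binary_cross_entropy(logits, targets):
    """Numerically stable binary cross-entropy from raw network scores.

    logits  -- unnormalized real-valued outputs of the network
    targets -- labels encoded as +1 / -1 (NOT 0 / 1)
    """
    # log sigmoid(logits * targets) = -log(1 + exp(-logits * targets)),
    # computed stably via logaddexp even for very large |logits|.
    log_probs = -np.logaddexp(0.0, -logits * targets)
    return -np.mean(log_probs)
```

A common source of nonsense results is a label-encoding mismatch: feeding 0/1 labels into a loss written for +1/-1 targets (or vice versa) silently produces wrong losses rather than an error.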
