Possibly incorrect implementation of Batch Renorm #14

Open
waleedka opened this issue Aug 3, 2017 · 1 comment

waleedka commented Aug 3, 2017

I came across your implementation of batch re-normalization in the BatchReNormDNNLayer class, and I think there may be an error that could be affecting the model's performance.

My understanding of batch re-norm is that it applies the standard BN normalization first, then the r/d correction, and finally the gamma/beta scaling and bias. Something along these lines:

normed_x = (x - batch_mean) / batch_std    # standard BN
normed_x = normed_x * r + d                # The batch renorm correction
normed_x = normed_x * gamma + beta         # final scale and bias
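
For reference, here is a minimal NumPy sketch of the training-time forward pass as I understand it from the Batch Renormalization paper. The function name and the rmax/dmax/eps values are just illustrative defaults, not taken from this repo:

import numpy as np

def batch_renorm_train_forward(x, gamma, beta, moving_mean, moving_std,
                               rmax=3.0, dmax=5.0, eps=1e-5):
    # Per-feature batch statistics; x has shape (batch, features).
    batch_mean = x.mean(axis=0)
    batch_std = np.sqrt(x.var(axis=0) + eps)

    # r and d compare the batch statistics to the moving statistics;
    # in the paper, no gradient is propagated through them.
    r = np.clip(batch_std / moving_std, 1.0 / rmax, rmax)
    d = np.clip((batch_mean - moving_mean) / moving_std, -dmax, dmax)

    normed = (x - batch_mean) / batch_std   # standard BN
    normed = normed * r + d                 # batch renorm correction
    return normed * gamma + beta            # scale and bias applied last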

However, this line applies the r/d correction after the scaling and shifting with gamma and beta:
https://github.com/ajbrock/Neural-Photo-Editor/blob/master/layers.py#L128

It probably works anyway, based on the good results you seem to have gotten. I just thought I'd bring it to your attention.

ajbrock (Owner) commented Aug 4, 2017

Thanks, this is definitely a discrepancy, likely born of the fact that I threw this layer together in a couple of minutes right after seeing the paper, and also because cuDNN's batchnorm does the scale+shift automatically. If you want to submit a pull request to fix it (just moving the gamma+beta outside the cuDNN bnorm call would do it), feel free, though I'm not really supporting much of the code in this repo these days <_<
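
For illustration, a rough sketch of that reordering, with a hypothetical standardize helper standing in for the cuDNN batchnorm call made with gamma fixed to 1 and beta to 0; this is not the actual code in layers.py:

import numpy as np

def standardize(x, eps=1e-5):
    # Stand-in for the batchnorm call with gamma=1 and beta=0:
    # it only centers and scales by the batch statistics.
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

def renorm_forward_fixed(x, r, d, gamma, beta):
    normed = standardize(x)          # normalization without the affine part
    normed = normed * r + d          # r/d correction next
    return normed * gamma + beta     # gamma/beta moved outside the bnorm call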
