
v1.3 batch_norm layer #16652

Closed
sddi opened this issue Feb 1, 2018 · 1 comment

Comments

@sddi

sddi commented Feb 1, 2018

I use the batch norm layer like this:

```python
def batch_norm_layer(x, train_phase, scope_bn):
    bn_train = batch_norm(x, decay=0.999, center=True, scale=True,
                          is_training=True,
                          reuse=None,  # is this right?
                          trainable=True,
                          scope=scope_bn)
    bn_inference = batch_norm(x, decay=0.999, center=True, scale=True,
                              is_training=False,
                              reuse=True,  # is this right?
                              trainable=True,
                              scope=scope_bn)
    z = tf.cond(train_phase, lambda: bn_train, lambda: bn_inference)
    return z
```
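(For reference: in TF 1.x, `tf.contrib.layers.batch_norm` documents that `is_training` may be either a Python bool or a boolean scalar tensor, so the double-call-with-`reuse` plus `tf.cond` pattern can often be collapsed into a single call. A minimal sketch, not verified against v1.3 specifically:)

```python
# Sketch assuming TF 1.x, where tf.contrib.layers.batch_norm exists.
def batch_norm_layer(x, train_phase, scope_bn):
    # Deferred import: the sketch only needs TensorFlow when called.
    from tensorflow.contrib.layers import batch_norm  # TF 1.x only
    # is_training accepts a boolean tensor, so a single call with the
    # train_phase placeholder replaces the two reuse-tied calls above.
    return batch_norm(x, decay=0.999, center=True, scale=True,
                      is_training=train_phase, scope=scope_bn)
```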

I don't know whether this code works in v1.3.0.
I saw issue #1122, where someone said it would not work well.

Thank you in advance.

@drpngx
Contributor

drpngx commented Feb 2, 2018

This question is better asked on StackOverflow since it is not a bug or feature request. There is also a larger community that reads questions there. Thanks!

@drpngx drpngx closed this as completed Feb 2, 2018