
deepcell.model_zoo.fpn.__create_semantic_head applies ReLU for n_channels=1 #624

Open
JLrumberger opened this issue Sep 12, 2022 · 0 comments
Labels: bug (Something isn't working)

Comments


JLrumberger commented Sep 12, 2022

Describe the bug

deepcell.model_zoo.fpn.__create_semantic_head, which is used by default in your segmentation models, applies a ReLU activation function to the output when n_classes == 1. I would expect segmentation models with a single-channel output to use a sigmoid activation in their last layer. Furthermore, applying ReLU makes the logits unusable for binary_cross_entropy_with_logits, since sigmoid(relu(x)) has a lower bound of 0.5.
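As a quick illustration (plain TensorFlow, not deepcell code), composing sigmoid with ReLU never yields a value below 0.5:

import tensorflow as tf

x = tf.constant([-3.0, 0.0, 3.0])
# ReLU zeroes out negatives, so sigmoid(relu(x)) is bounded below by sigmoid(0) = 0.5
print(tf.sigmoid(tf.nn.relu(x)).numpy())  # approximately [0.5, 0.5, 0.953]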

To Reproduce

Something like the code below returns values bounded within [0.5, 1]:

import numpy as np
import tensorflow as tf
from deepcell.model_zoo.panopticnet import PanopticNet

# Random 2-channel input and a PanopticNet with a single semantic output channel
x = tf.constant(np.random.rand(1, 256, 256, 2), tf.float32)
model = PanopticNet(
    backbone="resnet50", input_shape=[256, 256, 2],
    norm_method="std", num_semantic_classes=[1],
)
out = model(x)
# Because the head already applied ReLU, sigmoid(out) never drops below 0.5
print(tf.reduce_min(tf.sigmoid(out)))

Expected behavior
The expected behavior is that the single-channel output is passed through a sigmoid activation, not ReLU.
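For illustration, here is a minimal sketch of how the head's final activation could be chosen by the number of output channels (the helper and layer names are assumptions for this example, not deepcell's actual implementation):

from tensorflow.keras.layers import Activation

def final_semantic_activation(x, n_classes):
    # Hypothetical helper: with a single output channel, apply sigmoid for
    # binary segmentation; with multiple channels, softmax is the usual choice.
    if n_classes == 1:
        return Activation('sigmoid', name='semantic_sigmoid')(x)
    return Activation('softmax', name='semantic_softmax')(x)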

Desktop (please complete the following information):

  • OS: Windows 10
  • deepcell: 0.12.3
  • Python: 3.8.0

Thanks for maintaining this library :)

JLrumberger added the bug label on Sep 12, 2022