
Placement of dropout layers #1

Open

aalapshah12297 opened this issue May 9, 2019 · 0 comments
From what I understand, dropout can only benefit the layers that come after it, since those layers must learn to predict the same output from a subset of their inputs. In your implementation, dropout appears after the final conv layer and is only followed by an upsampling and a softmax layer, neither of which has parameters to learn. It seems more logical to place the dropout layer(s) between the classifier layers, i.e. at lines 110 and 114 (a sketch of that placement follows the snippet below).

"""## Step 4: Classifier"""
classifier = tf.keras.layers.SeparableConv2D(128, (3, 3), padding='same', strides = (1, 1), name = 'DSConv1_classifier')(ff_final)
classifier = tf.keras.layers.BatchNormalization()(classifier)
classifier = tf.keras.activations.relu(classifier)
classifier = tf.keras.layers.SeparableConv2D(128, (3, 3), padding='same', strides = (1, 1), name = 'DSConv2_classifier')(classifier)
classifier = tf.keras.layers.BatchNormalization()(classifier)
classifier = tf.keras.activations.relu(classifier)
classifier = conv_block(classifier, 'conv', 19, (1, 1), strides=(1, 1), padding='same', relu=True)
classifier = tf.keras.layers.Dropout(0.3)(classifier)
classifier = tf.keras.layers.UpSampling2D((8, 8))(classifier)
classifier = tf.keras.activations.softmax(classifier)
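
Concretely, something like this is what I have in mind. This is only a sketch: it reuses the `ff_final` tensor and `conv_block` helper from your code, reading "lines 110 and 114" as the two ReLU activations is my interpretation, and the 0.3 rate is just carried over from the current code.

# Suggested placement (sketch, not the repo's code): dropout after each
# intermediate activation, so both separable-conv layers and the final conv
# are forced to cope with dropped inputs.
classifier = tf.keras.layers.SeparableConv2D(128, (3, 3), padding='same', strides=(1, 1), name='DSConv1_classifier')(ff_final)
classifier = tf.keras.layers.BatchNormalization()(classifier)
classifier = tf.keras.activations.relu(classifier)
classifier = tf.keras.layers.Dropout(0.3)(classifier)  # suggested: before DSConv2_classifier
classifier = tf.keras.layers.SeparableConv2D(128, (3, 3), padding='same', strides=(1, 1), name='DSConv2_classifier')(classifier)
classifier = tf.keras.layers.BatchNormalization()(classifier)
classifier = tf.keras.activations.relu(classifier)
classifier = tf.keras.layers.Dropout(0.3)(classifier)  # suggested: before the final conv
classifier = conv_block(classifier, 'conv', 19, (1, 1), strides=(1, 1), padding='same', relu=True)
classifier = tf.keras.layers.UpSampling2D((8, 8))(classifier)
classifier = tf.keras.activations.softmax(classifier)

This way every layer with learnable parameters in the classifier head sees dropped activations, instead of dropout feeding only into parameter-free layers.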
