Add ops.random.alpha_dropout and layers.AlphaDropout #18940

Merged
merged 2 commits into keras-team:master from james77777778:add-alpha-dropout on Dec 15, 2023

Conversation

james77777778 (Contributor)

Related to #18467

This PR should benefit users who want to apply the self-normalizing property to their models.

The implementation is taken from:
https://github.com/keras-team/keras/blob/v2.14.0/keras/layers/regularization/alpha_dropout.py#L28-L104
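
As a rough illustration (not part of this PR's diff), AlphaDropout is meant to be used together with selu activations and lecun_normal initialization, which is the setup the self-normalizing property relies on. A minimal usage sketch with the standard Keras API:

import keras
from keras import layers

# A self-normalizing MLP: selu activations with lecun_normal initialization,
# regularized with AlphaDropout to keep activations near zero mean and unit variance.
model = keras.Sequential([
    layers.Dense(256, activation="selu", kernel_initializer="lecun_normal"),
    layers.AlphaDropout(rate=0.1),
    layers.Dense(256, activation="selu", kernel_initializer="lecun_normal"),
    layers.AlphaDropout(rate=0.1),
    layers.Dense(10, activation="softmax"),
])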

codecov-commenter commented Dec 14, 2023

Codecov Report

All modified and coverable lines are covered by tests ✅

Comparison is base (e70f28f) 79.51% compared to head (7b1b3e8) 79.57%.
Report is 1 commit behind head on master.

Additional details and impacted files
@@            Coverage Diff             @@
##           master   #18940      +/-   ##
==========================================
+ Coverage   79.51%   79.57%   +0.05%     
==========================================
  Files         336      337       +1     
  Lines       34975    35056      +81     
  Branches     6865     6872       +7     
==========================================
+ Hits        27812    27896      +84     
+ Misses       5585     5583       -2     
+ Partials     1578     1577       -1     
Flag               Coverage Δ
keras              79.43% <100.00%> (+0.05%) ⬆️
keras-jax          61.21% <100.00%> (+0.02%) ⬆️
keras-numpy        55.97% <86.36%> (-0.01%) ⬇️
keras-tensorflow   63.20% <100.00%> (+0.05%) ⬆️
keras-torch        63.85% <100.00%> (+0.04%) ⬆️

Flags with carried forward coverage won't be shown.


@@ -81,6 +82,24 @@ def dropout(inputs, rate, noise_shape=None, seed=None):
)


def alpha_dropout(inputs, rate, noise_shape=None, seed=None):
    noise_shape = _get_concrete_noise_shape(inputs, noise_shape)
    alpha = 1.6732632423543772848170429916717
Contributor
Any specific reason for having more than 6-7 decimal points?

Member
In practice it will be cast to float32, so some precision will be lost. The number above is just taken from the paper.
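
For context, float32 carries roughly 7 significant decimal digits, so everything past that is dropped on the cast anyway. A quick check (assuming NumPy is available, purely illustrative):

import numpy as np

alpha = 1.6732632423543772848170429916717  # SELU alpha constant from the paper
print(np.float32(alpha))  # ~1.6732632 -- float32 keeps about 7 significant digits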

fchollet (Member) left a comment
Thanks for the PR! Ok to add the layer -- but let's drop the op.


def call(self, inputs, training=False):
    if training and self.rate > 0:
        return backend.random.alpha_dropout(
Member
We can implement the layer in terms of backend ops and random.uniform. There is no real need for the alpha_dropout op -- only the layer would get used (even then, it's fairly niche usage).
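
For reference, a sketch of what an op-free call method could look like, adapted from the Keras 2 alpha_dropout implementation linked in the description. It assumes the layer holds a self.seed_generator (keras.random.SeedGenerator) and, for brevity, ignores noise_shape; the actual merged code may differ.

from keras import ops, random

def call(self, inputs, training=False):
    if training and self.rate > 0:
        # SELU constants from the "Self-Normalizing Neural Networks" paper.
        alpha = 1.6732632423543772848170429916717
        scale = 1.0507009873554804934193349852946
        alpha_p = -alpha * scale  # value that dropped units are set to

        # Keep mask drawn from a uniform distribution (1 = keep, 0 = drop).
        kept_idx = ops.cast(
            ops.greater_equal(
                random.uniform(ops.shape(inputs), seed=self.seed_generator),
                self.rate,
            ),
            inputs.dtype,
        )

        # Affine transform that restores zero mean and unit variance after dropping.
        a = ((1 - self.rate) * (1 + self.rate * alpha_p**2)) ** -0.5
        b = -a * alpha_p * self.rate

        x = inputs * kept_idx + alpha_p * (1 - kept_idx)
        return a * x + b
    return inputs

Setting dropped units to alpha_p rather than zero, followed by the a * x + b rescaling, is what preserves the self-normalizing mean and variance, which is why this variant is preferred over standard dropout for selu networks.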

james77777778 (Contributor, Author)
Sure, I've made the changes.

fchollet (Member) left a comment
Thank you for the contribution!

google-ml-butler bot added the kokoro:force-run and ready to pull (Ready to be merged into the codebase) labels on Dec 15, 2023
fchollet merged commit a14af85 into keras-team:master on Dec 15, 2023
6 checks passed
google-ml-butler bot removed the awaiting review, ready to pull (Ready to be merged into the codebase), and kokoro:force-run labels on Dec 15, 2023
james77777778 deleted the add-alpha-dropout branch on December 15, 2023 at 07:44