Activation function of generator and discriminator? #27

Open
YongWookHa opened this issue Nov 28, 2018 · 0 comments

Comments

@YongWookHa

Hello.

In dcgan.py's generator_model(), the model uses tanh not only for the last layer but for every layer.
The original paper recommends using ReLU for every layer except the last one.
The same difference exists in discriminator_model() as well (where the paper suggests LeakyReLU).
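
For reference, this is a rough Keras sketch of what I understand the paper to recommend; the layer sizes and input shapes are just illustrative, not taken from your dcgan.py:

```python
from keras.models import Sequential
from keras.layers import (Activation, BatchNormalization, Conv2D, Dense,
                          Flatten, LeakyReLU, Reshape, UpSampling2D)


def generator_model_paper():
    # ReLU in every hidden layer, tanh only on the output layer.
    model = Sequential()
    model.add(Dense(128 * 7 * 7, input_dim=100))
    model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(Reshape((7, 7, 128)))
    model.add(UpSampling2D(size=(2, 2)))
    model.add(Conv2D(64, (5, 5), padding='same'))
    model.add(BatchNormalization())
    model.add(Activation('relu'))
    model.add(UpSampling2D(size=(2, 2)))
    model.add(Conv2D(1, (5, 5), padding='same'))
    model.add(Activation('tanh'))  # tanh restricted to the last layer
    return model


def discriminator_model_paper():
    # LeakyReLU throughout the discriminator, sigmoid on the final score.
    model = Sequential()
    model.add(Conv2D(64, (5, 5), strides=(2, 2), padding='same',
                     input_shape=(28, 28, 1)))
    model.add(LeakyReLU(0.2))
    model.add(Conv2D(128, (5, 5), strides=(2, 2), padding='same'))
    model.add(LeakyReLU(0.2))
    model.add(Flatten())
    model.add(Dense(1, activation='sigmoid'))
    return model
```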

Is there a specific reason you built these differently?

Thank you for sharing your code, in any case.
