Setting discriminator updates to 1 causes grad_norm losses to fail to update #229

Open · FirestName opened this issue May 11, 2023 · 0 comments

Setting the number of discriminator updates to 1 (in v2.gin) causes the grad_norm losses to stop updating, at least with the wasserstein config.
Training still runs, and no error messages are produced.

Should this be the case?

Additionally, it would be interesting to test even more frequent discriminator updates, as is commonly recommended for Wasserstein GANs and the like; lowering the discriminator update setting from 4 to 2 already appears to benefit training. A sketch of one way this could go wrong at 1 follows below.
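For context, one common way such a setting is implemented is as an update period: the discriminator (critic) is trained on steps where `step % period == 0` and the generator otherwise. The following is a minimal, self-contained sketch of that pattern, not the repository's actual code; the parameter name `update_discriminator_every`, the toy models, and the branch layout are all assumptions for illustration. If the grad_norm-balanced losses are only computed on the generator branch, a period of 1 would leave them frozen, matching the symptom described above.

```python
# Hypothetical sketch of an alternating GAN update schedule, NOT the
# repository's code. With update_discriminator_every == 1, the condition
# step % 1 == 0 is always true, so the generator branch never runs.

import torch
import torch.nn as nn

torch.manual_seed(0)

gen = nn.Linear(8, 8)    # stand-in generator
disc = nn.Linear(8, 1)   # stand-in discriminator (critic)
g_opt = torch.optim.Adam(gen.parameters(), lr=1e-3)
d_opt = torch.optim.Adam(disc.parameters(), lr=1e-3)

update_discriminator_every = 1  # the setting discussed in this issue

for step in range(8):
    real = torch.randn(4, 8)
    fake = gen(torch.randn(4, 8))

    if step % update_discriminator_every == 0:
        # Critic step: Wasserstein-style loss, updated frequently.
        d_loss = disc(fake.detach()).mean() - disc(real).mean()
        d_opt.zero_grad()
        d_loss.backward()
        d_opt.step()
        print(f"step {step}: discriminator update")
    else:
        # Generator step, where the grad_norm-balanced losses would be
        # computed; with a period of 1 this branch never executes.
        g_loss = -disc(fake).mean()
        g_opt.zero_grad()
        g_loss.backward()
        g_opt.step()
        print(f"step {step}: generator update")
```

With the period set to 1, every iteration prints "discriminator update". Whether this is the actual mechanism behind the frozen grad_norm losses would need to be checked against the training loop in this repository.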
