strange image artifact in training and testing #140
Comments
GANs are unstable.
Same problem as #46 - it's an issue that has haunted me for years; pix2pix has it too. I still have not found the cause, but once a model has caught these types of artifacts there seems to be no way to get rid of them, so you might want to restart the training from scratch.
I'm positive these artifacts are caused by the nn.Tanh that produces the final generator output. Maybe it is too complicated for the model to learn to utilize the full range of [-1, 1], or there are some misalignments in the input normalization methods used in this repo. Switching to nn.Sigmoid completely eliminates these artifacts for me. Of course, you need to modify some input normalization methods and the im2tensor and tensor2im functions to accommodate this change.
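A minimal sketch of what this swap could look like, assuming a pix2pix-style setup where the generator head ends in an activation and `im2tensor`/`tensor2im` handle the image ↔ tensor conversion (the layer shapes and function names here are illustrative, not the repo's exact code):

```python
import numpy as np
import torch
import torch.nn as nn

# Original head squashes to [-1, 1]; the proposed head squashes to [0, 1].
head_tanh = nn.Sequential(nn.Conv2d(64, 3, kernel_size=1), nn.Tanh())
head_sigmoid = nn.Sequential(nn.Conv2d(64, 3, kernel_size=1), nn.Sigmoid())

def im2tensor_sigmoid(im):
    """Map a uint8 HxWxC image to a [0, 1] float tensor of shape 1xCxHxW
    (for Tanh you would instead normalize to [-1, 1] via t * 2 - 1)."""
    t = torch.from_numpy(im.astype(np.float32) / 255.0)
    return t.permute(2, 0, 1).unsqueeze(0)

def tensor2im_sigmoid(t):
    """Map a [0, 1] generator output back to a uint8 HxWxC image
    (for Tanh you would first undo the shift via (t + 1) / 2)."""
    arr = t.squeeze(0).permute(1, 2, 0).detach().cpu().numpy()
    return (np.clip(arr, 0.0, 1.0) * 255.0).round().astype(np.uint8)
```

The key point is that the output-range convention, the input normalization, and the two conversion functions must all agree; changing only the activation while leaving the [-1, 1] normalization in place would itself introduce artifacts.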
Hello, your email has been received. Wishing you all the best and a happy life!
Not sure what's happening here. I get these strange artifacts that appear during training and testing. I have tried changing the learning rate with no effect. Any help would be much appreciated! Thanks.