
Light spot in training. #46

Open
594cp opened this issue Jul 23, 2018 · 9 comments

Comments

@594cp

594cp commented Jul 23, 2018

Hi, I have a problem with my training. It works well when I train 256x256 images using train_A to train_B, but when I train 512x1024 images using label-to-image without an instance map, there is a light spot in every generated image. What's wrong?

[Screenshots: the red rectangle marks the light spot position in the generated images.]

Has anyone else had the same problem?

@cientgu

cientgu commented Aug 7, 2018

@594cp I ran into the same problem during training, but it does not appear at test time. I have seen this kind of light spot in style transfer tasks as well, but I don't know why it happens.

@tcwang0509
Contributor

As pointed out by @Quasimondo, this problem can be alleviated if you replace all zero padding in the global generator with reflection padding.
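Concretely, the swap looks roughly like this (an illustrative sketch, not the exact lines in the repo; channel counts are arbitrary): a convolution that pads with zeros internally is replaced by an explicit nn.ReflectionPad2d followed by the same convolution with padding=0, which leaves the output size unchanged.

    import torch.nn as nn

    # Before: the 3x3 stride-2 conv pads with zeros internally
    down_zero = nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1)

    # After: reflect the border explicitly, then convolve with padding=0
    down_reflect = nn.Sequential(
        nn.ReflectionPad2d(1),
        nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=0),
    )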

@Quasimondo

I should add that changing the padding does not solve the artifact problem entirely, but there seem to be fewer of them. Overall this is indeed a problem that bothers me quite a bit: once a model has "caught" these artifacts (which I believe originate from the residual layers), they seem to hamper the learning process considerably, since they bias the normalization. You might have noticed that frames showing these artifacts are usually quite flat and low in contrast in the other areas. Further training usually just increases the number of artifacts.

One experimental approach to "heal" a model that has caught them is to increase the learning rate a lot, e.g. 0.0008 instead of 0.0002. It is then important to watch the training results and stop the process once the artifacts seem to have disappeared. Of course this is a dangerous path, and you should keep a backup of your model, since if the gradients explode your model will be ruined.
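For reference, a minimal sketch of what I mean (the variable names assume the two optimizers created in pix2pixHD's train.py, e.g. optimizer_G and optimizer_D; adapt to your setup):

    # Raise the learning rate on both optimizers before resuming training.
    # Keep a checkpoint of the model first, in case the gradients explode.
    for optimizer in (optimizer_G, optimizer_D):
        for param_group in optimizer.param_groups:
            param_group['lr'] = 0.0008  # up from the default 0.0002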

@ksteinfe

Same issue as described above.

@tcwang0509, would you mind detailing how to go about replacing the zero paddings with reflection paddings?

I see that the default padding_type argument of the __init__ method of the GlobalGenerator class is already "reflect". Is that default a result of a change made since the start of this thread?

class GlobalGenerator(nn.Module):
    def __init__(self, input_nc, output_nc, ngf=64, n_downsampling=3, n_blocks=9, norm_layer=nn.BatchNorm2d, 
                 padding_type='reflect'):

Or are changes needed on the following lines, which call nn.Conv2d with padding=0?

line 192
model = [nn.ReflectionPad2d(3), nn.Conv2d(input_nc, ngf, kernel_size=7, padding=0), norm_layer(ngf), activation]
line 209
model += [nn.ReflectionPad2d(3), nn.Conv2d(ngf, output_nc, kernel_size=7, padding=0), nn.Tanh()]

Thank you for your help!

@Quasimondo

Quasimondo commented Feb 20, 2019 via email

@ksteinfe

Like this?
starting on line 193...

        ### downsample
        for i in range(n_downsampling):
            mult = 2**i
            model += [nn.ReflectionPad2d(3), nn.Conv2d(ngf * mult, ngf * mult * 2, kernel_size=3, stride=2, padding=0),
                      norm_layer(ngf * mult * 2), activation]

@Quasimondo

Quasimondo commented Mar 12, 2019 via email

@aviel08

aviel08 commented Aug 20, 2019

Hey @ksteinfe, are you sure that code is right? It gives me an error when I try it. I changed the reflection type to 1 and it works for me.
I think the order is:
1 = reflect
2 = replicate
3 = zero

    model += [nn.ReflectionPad2d(1), nn.Conv2d(ngf * mult, ngf * mult * 2, kernel_size=3,
              stride=2, padding=0), norm_layer(ngf * mult * 2), activation]

@hackgoofer

@aviel08 The argument to ReflectionPad2d is just the padding size: https://pytorch.org/docs/master/generated/torch.nn.ReflectionPad2d.html
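So for a 3x3 convolution, ReflectionPad2d(1) keeps the output the same size as padding=1 would. A quick, hypothetical sanity check (not part of the repo):

    import torch
    import torch.nn as nn

    x = torch.randn(1, 64, 256, 256)
    zero_pad = nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1)
    reflect = nn.Sequential(nn.ReflectionPad2d(1),
                            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=0))
    print(zero_pad(x).shape)  # torch.Size([1, 128, 128, 128])
    print(reflect(x).shape)   # torch.Size([1, 128, 128, 128])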
