Another question about "Equal" linear #356

Open
zhzhzoo-autra opened this issue May 15, 2024 · 3 comments


zhzhzoo-autra commented May 15, 2024

Hi, thanks for releasing this reference implementation for stylegan!
I have a question about EqualLinear. In the stylegan codebase (https://github.com/rosinality/style-based-gan-pytorch/blob/07fa60be77b093dd13a46597138df409ffc3b9bc/model.py#L203) there's an explicit "equal_lr" operation, but I can't find any similar code in this codebase. Does that mean I don't need to adjust the learning rate dynamically during training, and that setting different learning rates for different layers at the very beginning is sufficient? Thanks!

@ken881015

In model.py, there are two classes for equalizing the learning rate: EqualConv2d and EqualLinear.

@zhzhzoo-autra
Author

> In model.py, there are two classes for equalizing the learning rate: EqualConv2d and EqualLinear.

In EqualConv2d, self.scale and self.lr_mul seem to affect the effective learning rate. But I can't find any code adjusting them during the training process.

@ken881015

Yeah, but as you can see, the weight actually used in the convolution is self.weight * self.scale. In my opinion, it can be said that we use the gradient from self.weight * self.scale to update self.weight.

As you already know, EqualXX is designed because the scale of the weights differs layer-wise; the point of multiplying by self.scale is actually to scale the gradient. Since the effective weight is ŵ = w * scale, the chain rule gives ∂L/∂w = (∂L/∂ŵ) * scale, so the update step is w' = w - lr * (∂L/∂ŵ) * scale. That's how I understand it, but I'm not sure about its correctness...
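
To make the mechanism concrete, here is a minimal sketch of an equalized-lr linear layer (an illustration of the idea, not a verbatim copy of this repo's model.py): because the multiplication by scale happens inside forward, autograd multiplies the gradient of self.weight by the same constant on every step, so no explicit per-step learning-rate adjustment is ever needed.

```python
import math

import torch
from torch import nn
from torch.nn import functional as F


class EqualLinear(nn.Module):
    """Minimal equalized-lr linear layer (sketch, not the repo's exact code)."""

    def __init__(self, in_dim, out_dim, lr_mul=1.0):
        super().__init__()
        # The raw parameter is N(0, 1) divided by lr_mul; the He-init constant
        # is folded into self.scale instead of into the initialization.
        self.weight = nn.Parameter(torch.randn(out_dim, in_dim).div_(lr_mul))
        self.bias = nn.Parameter(torch.zeros(out_dim))
        self.scale = (1 / math.sqrt(in_dim)) * lr_mul
        self.lr_mul = lr_mul

    def forward(self, x):
        # The scaling is part of the graph, so dL/dweight = dL/dw_hat * scale:
        # the per-layer "learning rate" is baked in as a fixed constant.
        return F.linear(x, self.weight * self.scale, self.bias * self.lr_mul)


# A single global optimizer lr then works for every layer, e.g.:
layer = EqualLinear(512, 512, lr_mul=0.01)  # StyleGAN's mapping net uses a reduced lr_mul
opt = torch.optim.Adam(layer.parameters(), lr=0.002)
```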
