
Error Freezing Weights #1126

Open
mostafaelhoushi opened this issue Jun 5, 2023 · 0 comments

I am trying to freeze the weights of linear layers, but I get this error:

module.weight.requires_grad = not freeze
RuntimeError: you can only change requires_grad flags of leaf variables. If you want to use a computed variable in a subgraph that doesn't require differentiation use var_no_grad = var.detach().
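For reference, here is a minimal sketch (using a plain nn.Linear as a stand-in for ColumnParallelLinear) of the PyTorch behavior behind this error: requires_grad can only be changed on leaf tensors, so the same assignment fails once the weight has been replaced by a computed (non-leaf) tensor:

```python
import torch
import torch.nn as nn

layer = nn.Linear(4, 4)

# layer.weight is a leaf Parameter, so toggling requires_grad works:
layer.weight.requires_grad = False
layer.weight.requires_grad = True

# A computed tensor derived from the weight is not a leaf, and the same
# assignment raises the RuntimeError quoted above:
w = layer.weight * 1.0
print(w.is_leaf)         # False
w.requires_grad = False  # RuntimeError: you can only change requires_grad flags of leaf variables ...
```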

In my case, the module is ColumnParallelLinear, whose weight does seem to be a leaf variable:

self.weight = Parameter(torch.Tensor(self.output_size_per_partition, self.in_features))

so I am not sure why I am getting this error message.
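
One way to narrow this down is to check is_leaf on the actual weight right before freezing. The sketch below is hypothetical (report_and_freeze, model, and freeze are made-up names, not from this report); it only toggles requires_grad on weights that are still leaves and prints the rest, so you can see which module's weight has been rewrapped into a non-leaf tensor:

```python
import torch
import torch.nn as nn

def report_and_freeze(model: nn.Module, freeze: bool = True) -> None:
    """Print whether each module's weight is a leaf before toggling requires_grad."""
    for name, module in model.named_modules():
        weight = getattr(module, "weight", None)
        if torch.is_tensor(weight):
            print(f"{name}: {type(weight).__name__}, is_leaf={weight.is_leaf}")
            if weight.is_leaf:
                weight.requires_grad = not freeze
            else:
                # A non-leaf weight means something rewrapped or recomputed the
                # original Parameter; freezing it here would raise the
                # RuntimeError quoted above.
                print(f"  skipping {name}: weight is not a leaf")
```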
