```python
assert not x.requires_grad
```

Any reason why `requires_grad` needs to be `False` for replacing a tensor? https://github.com/tinygrad/tinygrad/blob/master/tinygrad/tensor.py#L143
Here is my use case which doesn't work. It's quite common in RL to copy weights to target networks.
```python
from tinygrad import nn

class Network:
  def __init__(self):
    self.test = nn.Linear(10, 1)

net = Network()
target_net = Network()
tau = 0.9

for param, target_param in zip(nn.state.get_parameters(net), nn.state.get_parameters(target_net)):
  target_param.replace(tau * param + (1 - tau) * target_param)
```
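Independent of tinygrad, the update the loop performs is plain Polyak averaging (the standard soft target-network update in RL). A minimal NumPy sketch of just the math — the function name here is illustrative, not tinygrad API:

```python
import numpy as np

def soft_update(params, target_params, tau):
    # Polyak averaging: target <- tau * online + (1 - tau) * target
    return [tau * p + (1 - tau) * tp for p, tp in zip(params, target_params)]

params = [np.array([1.0, 2.0])]
targets = [np.array([0.0, 0.0])]
new_targets = soft_update(params, targets, tau=0.9)
# new_targets[0] -> array([0.9, 1.8])
```

With `tau = 1.0` this degenerates to a hard copy of the online weights, which is why something like `torch.Tensor.copy_` (or tinygrad's `replace`/`assign`) is wanted on tensors that still have `requires_grad=True`.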
Unless there are any other functions that do the same thing as `torch.Tensor.copy_`.
Check `Tensor.assign`, and for example its usage in the optimizers: `tinygrad/nn/optim.py`, line 40 at commit 0c197b9.
Oh, `assign` currently checks `not requires_grad` too; maybe it's okay to remove.
On it