assert not x.requires_grad in Tensor.replace()? #3879

Open
eugeneteoh opened this issue Mar 22, 2024 · 3 comments

@eugeneteoh

Any reason why requires_grad needs to be False for replacing a tensor?
https://github.com/tinygrad/tinygrad/blob/master/tinygrad/tensor.py#L143

Here is my use case, which doesn't work: it's quite common in RL to copy weights to target networks.

from tinygrad import nn

class Network:
    def __init__(self):
        self.test = nn.Linear(10, 1)

net = Network()
target_net = Network()
tau = 0.9

# Soft (Polyak) update of the target network; this currently trips the
# `assert not x.requires_grad` check inside Tensor.replace().
for param, target_param in zip(nn.state.get_parameters(net), nn.state.get_parameters(target_net)):
    target_param.replace(tau * param + (1 - tau) * target_param)
@eugeneteoh (Author)

Unless there are other functions that do the same thing as torch.tensor.copy_.
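
For reference, the usual PyTorch soft-update idiom being referred to looks roughly like the sketch below (illustrative only, not code from this issue; net/target_net are placeholder modules):

import torch
import torch.nn as nn

net = nn.Linear(10, 1)
target_net = nn.Linear(10, 1)
tau = 0.9

# copy_ writes into the target parameter in place; no_grad keeps autograd
# from tracking the update.
with torch.no_grad():
    for param, target_param in zip(net.parameters(), target_net.parameters()):
        target_param.copy_(tau * param + (1 - tau) * target_param)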

@chenyuxyz (Collaborator) commented Mar 22, 2024

Check Tensor.assign and, for example, its usage in the optimizers:

self.b[i].assign(self.momentum * self.b[i] + g) # NOTE: self.b[i] is zero on the first run, no if required

Oh, assign currently checks not requires_grad too; maybe it's okay to remove.
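
A possible workaround while the assert is still in place (a sketch only, assuming the check is on the incoming tensor's requires_grad and that Tensor.detach() returns a tensor with requires_grad=False) is to detach the blended value before assigning it:

from tinygrad import nn

class Network:
    def __init__(self):
        self.test = nn.Linear(10, 1)

net = Network()
target_net = Network()
tau = 0.9

# detach() constructs a new Tensor with requires_grad=False, so the value
# handed to assign (or replace) no longer trips the assertion.
for param, target_param in zip(nn.state.get_parameters(net), nn.state.get_parameters(target_net)):
    target_param.assign((tau * param + (1 - tau) * target_param).detach())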

@eugeneteoh (Author)

On it
