
Remove requires_grad check for replace and assign #3880

Closed
eugeneteoh wants to merge 5 commits

Conversation

@eugeneteoh commented Mar 22, 2024

Fixes #3879.
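
The shape of the change, reconstructed from the PR title and the linked issue (the body of replace below is an illustrative sketch, not the actual tinygrad source):

```python
# Illustrative sketch only -- the real tinygrad method body may differ.
# Tensor.replace swaps in new underlying data; this PR drops the
# requires_grad guard here (and the analogous one on assign).
def replace(self, x):
    # assert not x.requires_grad   # <- the check this PR removes
    assert self.shape == x.shape, "replace requires matching shapes"
    self.lazydata = x.lazydata    # point at the new buffer, no device copy
    return self
```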

@geohot (Collaborator) commented Mar 22, 2024

Can you add tests confirming that this is okay to do and that things behave as expected?

A Contributor commented:

This branch is currently behind tinygrad/master. The line count difference bot is disabled.

@eugeneteoh (Author) commented:
Added some simple tests. Let me know if you have better ideas, as I can't think of any right now.

Also, I don't fully understand what .assign() does, how it's different from .replace(), and why they have to be separate.
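
For readers following the thread, a minimal sketch of how the two methods are used; the comments on their semantics are assumptions drawn from how the optimisers call assign, not documented behaviour:

```python
from tinygrad import Tensor

p = Tensor([1.0, 2.0, 3.0])

# assign: an in-place update of the existing buffer, expressed lazily so it
# can be scheduled with the rest of the graph (what the optimisers build on).
p.assign(p - 0.1).realize()

# replace: swaps the Python-side reference to the underlying data; nothing
# is computed or copied on the device (assumed semantics).
p.replace(Tensor([7.0, 8.0, 9.0]))

print(p.numpy())  # [7. 8. 9.]
```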

@geohot (Collaborator) commented Mar 22, 2024

I mean test that the gradient behaves as expected, not that the flag is right!

@eugeneteoh (Author) commented:
replace/assign seem to work differently from torch.Tensor.copy_ (see my test). All the optimiser tests fail when requires_grad is set, since they use assign. Should replace and assign stay separate?
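
For context, the torch.Tensor.copy_ behaviour being compared against is standard PyTorch autograd: the gradient flows back through the copy to its source. Whether assign should mirror this is the open question here:

```python
import torch

a = torch.zeros(3, requires_grad=True)
b = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

c = a.clone()        # non-leaf copy of a, safe to mutate in-place
c.copy_(b)           # overwrite with b's values; autograd records the copy
c.sum().backward()

print(b.grad)        # tensor([1., 1., 1.]) -- grad reaches the copy source
print(a.grad)        # tensor([0., 0., 0.]) -- overwritten values get none
```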

@chenyuxyz (Collaborator) commented:
closing as stale

@chenyuxyz closed this Apr 4, 2024
Successfully merging this pull request may close these issues:

assert not x.requires_grad in Tensor.replace()? (#3879)