
[Feature] Support propagation of in-place modifications to masked tensordict #132

Draft · wants to merge 1 commit into main

Conversation


@riiswa commented Dec 25, 2022

Description

This is an attempt to solve this issue: pytorch/rl#298

My goal is not to change the behaviour of `__getitem__`, only `__setitem__`. I have added a new test that matches the example in the issue.
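
For concreteness, a minimal sketch of the behaviour this PR targets, following the example in pytorch/rl#298 (the shapes and key name here are illustrative, not taken from the test suite):

```python
import torch
from tensordict import TensorDict

# illustrative setup: a tensordict of batch size [3] with a single entry "a"
td = TensorDict({"a": torch.zeros(3, 4, 2)}, batch_size=[3])
mask = torch.tensor([True, False, True])
x = torch.randn(2, 4, 2)

# with this PR, writing through the masked view propagates back to td;
# without it, td[mask] is a detached copy and the write is silently lost
td[mask]["a"] = x
torch.testing.assert_allclose(td[mask]["a"], x)
```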

I just have a problem with the `test_nested_td_index` test:

```
test/test_tensordict.py:1640 (TestTensorDicts.test_nested_td_index[device0-idx_td])
Traceback (most recent call last):
  File "/Users/waris/Projects/tensordict/test/test_tensordict.py", line 1669, in test_nested_td_index
    td["sub_td", "sub_sub_td"] = other_sub_sub_td
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 1984, in __setitem__
    source.__setitem__(index, value)
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 2012, in __setitem__
    self.set(index, value, inplace=isinstance(self, SubTensorDict))
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 3512, in set
    return self.set_(key, tensor)
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 3588, in set_
    self._source.set_at_(key, tensor, self.idx)
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 2699, in set_at_
    _set_item(tensor_in, value, idx)
  File "/Users/waris/Projects/tensordict/tensordict/utils.py", line 640, in _set_item
    tensor[index] = value
  File "/Users/waris/Projects/tensordict/tensordict/tensordict.py", line 2028, in __setitem__
    raise RuntimeError(
RuntimeError: indexed destination TensorDict batch size is torch.Size([4, 3, 2, 1]) (batch_size = torch.Size([2, 4, 3, 2, 1]), index=(1,)), which differs from the source batch size torch.Size([4, 3, 2, 1, 2, 2])
```

Reading the error: after applying index=(1,), the destination expects batch size torch.Size([4, 3, 2, 1]), but the value being written carries torch.Size([4, 3, 2, 1, 2, 2]), i.e. two extra trailing batch dimensions, so somewhere along the inplace set path the nested value keeps batch dims it should have shed. I'm still trying to figure out where that happens. Please let me know if this PR seems relevant :)

Motivation and Context

It will close pytorch/rl#298

  • I have raised an issue to propose this change (required for new features and bug fixes)

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of examples)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.

@facebook-github-bot added the CLA Signed label Dec 25, 2022
@vmoens left a comment

Not super sure I understand how _sources work. Isn't it dangerous? Don't we risk an OOM error if we keep the source of the source of the source?
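
As an illustration of that concern, a toy sketch (a hypothetical `View` class standing in for the `_source` chain, not the PR's actual code):

```python
import torch

class View:
    """Toy stand-in for a masked view that pins its parent via _source."""
    def __init__(self, source, mask):
        self._source = source  # keeps the parent object alive
        self._mask = mask

base = torch.randn(1_000, 1_000)  # ~4 MB that cannot be freed below
v = View(base, torch.ones(1_000, dtype=torch.bool))
for _ in range(10):
    # each new view keeps the previous one (and transitively `base`) alive
    v = View(v, torch.ones(1_000, dtype=torch.bool))
# while `v` is reachable, the whole chain is, so repeated masking in a loop
# can accumulate memory instead of releasing intermediates
```

The new test from the diff, for reference: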

```python
mask = torch.tensor([True, False, True])
x = torch.randn(2, 4, 2)
td[mask]["a"] = x
torch.testing.assert_allclose(td[mask]["a"], x)
```
This is cool, but it's the simplest use case. Maybe let's try a couple of other masks?
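
A sketch of a few extra cases along those lines (assuming the same illustrative setup as above):

```python
import torch
from tensordict import TensorDict

td = TensorDict({"a": torch.zeros(3, 4, 2)}, batch_size=[3])

for mask in (
    torch.tensor([False, False, False]),  # selects nothing
    torch.tensor([True, True, True]),     # selects everything
    torch.rand(3) > 0.5,                  # random boolean mask
):
    x = torch.randn(int(mask.sum()), 4, 2)
    td[mask]["a"] = x
    torch.testing.assert_allclose(td[mask]["a"], x)
```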

@riiswa marked this pull request as draft December 30, 2022
@vmoens changed the title Support propagation of in-place modifications to masked tensordict → [Feature] Support propagation of in-place modifications to masked tensordict Jan 5, 2023