Describe the bug
TensorDict does not work when torch.distributed is not available.
To Reproduce
On a machine where torch.distributed.is_available() returns False, import TensorDict:
from tensordict import TensorDict
This raises the following traceback:
AttributeError Traceback (most recent call last)
Cell In[2], line 1
----> 1 from tensordict import TensorDict
2 import torch
3 a = torch.rand(3, 4)
File /opt/homebrew/anaconda3/envs/aaad-poc/lib/python3.9/site-packages/tensordict/__init__.py:7
1 # Copyright (c) Meta Platforms, Inc. and affiliates.
2 #
3 # This source code is licensed under the MIT license found in the
4 # LICENSE file in the root directory of this source tree.
6 from tensordict.memmap import MemmapTensor, set_transfer_ownership
----> 7 from tensordict.persistent import PersistentTensorDict
8 from tensordict.tensorclass import is_tensorclass, tensorclass
9 from tensordict.tensordict import (
10 is_batchedtensor,
11 is_memmap,
(...)
20 TensorDictBase,
21 )
File /opt/homebrew/anaconda3/envs/aaad-poc/lib/python3.9/site-packages/tensordict/persistent.py:27
24 import torch
26 from tensordict import MemmapTensor
---> 27 from tensordict.tensordict import (
28 _TensorDictKeysView,
29 CompatibleType,
30 is_tensor_collection,
31 NO_DEFAULT,
32 TensorDict,
33 TensorDictBase,
34 )
35 from tensordict.utils import (
36 _shape,
37 DeviceType,
(...)
41 NUMPY_TO_TORCH_DTYPE_DICT,
42 )
45 class _Visitor:
File /opt/homebrew/anaconda3/envs/aaad-poc/lib/python3.9/site-packages/tensordict/tensordict.py:317
312 return fn(*args, **kwargs)
314 return wrapper
--> 317 class TensorDictBase(MutableMapping):
318 """TensorDictBase is an abstract parent class for TensorDicts, a torch.Tensor data container."""
320 LOCK_ERROR = (
321 "Cannot modify locked TensorDict. For in-place modification, consider "
322 "using the `set_()` method and make sure the key is present."
323 )
File /opt/homebrew/anaconda3/envs/aaad-poc/lib/python3.9/site-packages/tensordict/tensordict.py:1141, in TensorDictBase()
1138 future.wait()
1139 return
-> 1141 def reduce(self, dst, op=dist.ReduceOp.SUM, async_op=False, return_premature=False):
1142 """Reduces the tensordict across all machines.
1143
1144 Only the process with ``rank`` dst is going to receive the final result.
1145
1146 """
1147 return self._reduce(dst, op, async_op, return_premature)
AttributeError: module 'torch.distributed' has no attribute 'ReduceOp'
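The root cause is a general Python pitfall: a default argument such as op=dist.ReduceOp.SUM is evaluated once, at class-definition time, while the module is being imported. On a build where torch.distributed exposes no ReduceOp, the attribute lookup therefore fails during import of the package rather than when reduce() is called. The sketch below reproduces this without torch; FakeDistributed, Broken, and Fixed are illustrative names, not tensordict's actual code, and the deferred-lookup pattern is one possible workaround, not the library's adopted fix.

```python
class FakeDistributed:
    """Stand-in for torch.distributed on a build where it is unavailable.

    It deliberately has no ReduceOp attribute, mimicking the broken build.
    """
    @staticmethod
    def is_available():
        return False

dist = FakeDistributed()

class_definition_failed = False
try:
    class Broken:
        # The default is evaluated immediately while the class body runs,
        # so the AttributeError fires here, at import time.
        def reduce(self, dst, op=dist.ReduceOp.SUM, async_op=False):
            ...
except AttributeError:
    class_definition_failed = True

class Fixed:
    # Deferred lookup: the attribute is only touched when reduce() is
    # actually called, so the class (and its module) imports cleanly.
    def reduce(self, dst, op=None, async_op=False):
        if op is None:
            op = dist.ReduceOp.SUM  # raises only if reduce() is used
        return dst, op

print(f"class-definition failed: {class_definition_failed}")
```

With this pattern, importing the module succeeds even when distributed support is missing; only a call to reduce() surfaces the error.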
System info
Describe the characteristic of your environment:
macOS 13.2.1
conda 23.3.1
python 3.9.18
torch 2.0.0.post2
numpy 1.25.2
Checklist
I have checked that there is no similar issue in the repo (required)