
RuntimeError: Mismatch in shape #124

Open
Algabri opened this issue Dec 19, 2022 · 1 comment

Algabri commented Dec 19, 2022

I am trying to run train_hopenet.py:

python3 train_hopenet.py --dataset AFLW2000 --data_dir datasets/AFLW2000 --filename_list datasets/AFLW2000/files.txt --output_string er

I got this error:

Loading data.

/home/redhwan/.local/lib/python3.8/site-packages/torch/optim/adam.py:90: UserWarning: optimizer contains a parameter group with duplicate parameters; in future, this will cause an error; see github.com/pytorch/pytorch/issues/40967 for more information
  super(Adam, self).__init__(params, defaults)
Ready to train network.
Traceback (most recent call last):
  File "train_hopenet.py", line 193, in <module>
    torch.autograd.backward(loss_seq, grad_seq)
  File "/home/redhwan/.local/lib/python3.8/site-packages/torch/autograd/__init__.py", line 166, in backward
    grad_tensors_ = _make_grads(tensors, grad_tensors_, is_grads_batched=False)
  File "/home/redhwan/.local/lib/python3.8/site-packages/torch/autograd/__init__.py", line 50, in _make_grads
    raise RuntimeError("Mismatch in shape: grad_output["
RuntimeError: Mismatch in shape: grad_output[0] has a shape of torch.Size([1]) and output[0] has a shape of torch.Size([]).


How can I solve it?

Note: torch.__version__ = 1.12.0+cu102

Algabri commented Dec 20, 2022

I changed this line:

grad_seq = [torch.ones(1).cuda(gpu) for _ in range(len(loss_seq))]

to:

grad_seq = [torch.tensor(1, dtype=torch.float).cuda(gpu) for _ in range(len(loss_seq))]

It is working fine now.
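
For anyone hitting the same error: the entries of loss_seq are 0-dim scalar tensors (shape torch.Size([]), as the traceback shows), and recent PyTorch versions require each grad tensor passed to torch.autograd.backward to match the output's shape exactly, whereas torch.ones(1) has shape torch.Size([1]). A minimal CPU-only sketch, independent of the repo's code, that reproduces both the error and the fix:

import torch

x = torch.randn(4, requires_grad=True)
loss = x.sum()                        # 0-dim scalar, torch.Size([])

bad_grad = torch.ones(1)              # torch.Size([1]) -> shape mismatch
# torch.autograd.backward([loss], [bad_grad])   # raises the RuntimeError above

good_grad = torch.tensor(1.0)         # torch.Size([]), matches the loss
torch.autograd.backward([loss], [good_grad])
print(x.grad)                         # tensor([1., 1., 1., 1.])

torch.ones([]).cuda(gpu) or torch.tensor(1.0).cuda(gpu) should work equally well, since both create 0-dim tensors.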
