
Dimension mismatch for einsum() in ops.py #167

Open
daniel-dpq opened this issue Mar 25, 2023 · 3 comments


@daniel-dpq

When using DAP, I encountered the following error:

  File "/home/pdengad/protein/fastfold/fastfold/model/fastnn/ops.py", line 153, in forward
    norm = torch.einsum('bsid,bsjd->bijd', M_mask_col, M_mask) + 1e-3
  File "/home/pdengad/anaconda3/envs/fastfold/lib/python3.8/site-packages/torch/functional.py", line 360, in einsum
    return _VF.einsum(equation, operands)  # type: ignore[attr-defined]
RuntimeError: einsum(): the number of subscripts in the equation (4) does not match the number of dimensions (5) for operand 0 and no ellipsis was given

I am running in multimer mode. In my case, M_mask has 5 dimensions: [1, batch_size, N_seq, N_res, 1]. The first dimension (1) comes from line 235 in fastnn.msa.py: msa_mask = msa_mask.unsqueeze(0). The last dimension comes from line 149 in fastnn.ops.py: M_mask = M_mask.unsqueeze(-1). I am wondering why there should be only 4 dimensions.
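For illustration, here is a minimal sketch of the mismatch. The shapes are taken from the feature dimensions listed below; the variable names are only for this example, not the actual FastFold code path.

```python
import torch

N_seq, N_res, d = 252, 134, 1

# What the einsum in ops.py expects: 4-D operands matching 'bsid' / 'bsjd'.
M_mask = torch.ones(1, N_seq, N_res, d)      # [b, s, i, d]
M_mask_col = torch.ones(1, N_seq, N_res, d)  # [b, s, j, d]
norm = torch.einsum('bsid,bsjd->bijd', M_mask_col, M_mask) + 1e-3  # works: [1, 134, 134, 1]

# What arrives in my multimer run: an extra leading dim of size 1,
# so after unsqueeze(-1) the mask is 5-D instead of 4-D.
M_mask_5d = M_mask.unsqueeze(0)              # [1, 1, 252, 134, 1]
try:
    torch.einsum('bsid,bsjd->bijd', M_mask_5d, M_mask_5d)
except RuntimeError as e:
    print(e)  # subscripts in the equation (4) do not match dimensions (5) of operand 0
```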
Here are the dimensions of some input features:

aatype torch.Size([1, 134, 1])
residue_index torch.Size([1, 134, 1])
msa torch.Size([1, 252, 134, 1])
asym_id torch.Size([1, 134, 1])
sym_id torch.Size([1, 134, 1])
entity_id torch.Size([1, 134, 1])
seq_mask torch.Size([1, 134, 1])
msa_mask torch.Size([1, 252, 134, 1])
target_feat torch.Size([1, 134, 21, 1])
extra_msa torch.Size([1, 260, 134, 1])
extra_deletion_matrix torch.Size([1, 260, 134, 1])
extra_msa_mask torch.Size([1, 260, 134, 1])
msa_feat torch.Size([1, 252, 134, 49, 1])

Thanks for any help.

@semal

semal commented Mar 27, 2023

Maybe you need to add the batch dimension to the parameters too.

@Gy-Lu
Contributor

Gy-Lu commented Mar 27, 2023

Hi, it seems these features were not produced by FastFold's preprocessing. Did you load a pickle file or something else as your features?

@daniel-dpq
Author

Thanks for your reply. Yes, I load my own features. I finally figured out that I need to remove the batch_size dimension from the input features.
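For anyone hitting the same error, here is a minimal sketch of what I mean, assuming the features are held in a plain dict of tensors (key names follow the listing in my first post; this is not FastFold's own preprocessing code):

```python
import torch

def strip_batch_dim(feats: dict) -> dict:
    """Drop a leading batch dimension of size 1 from every feature tensor."""
    out = {}
    for name, value in feats.items():
        if torch.is_tensor(value) and value.dim() > 0 and value.shape[0] == 1:
            out[name] = value.squeeze(0)
        else:
            out[name] = value
    return out

# e.g. msa_mask goes from [1, 252, 134, 1] to [252, 134, 1] before being passed
# to FastFold, so the later unsqueeze calls produce the expected 4-D masks.
```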
