
Could not find path to the "hhblits" binary #127

Open

addsg opened this issue Aug 23, 2023 · 0 comments

addsg commented Aug 23, 2023

Hi,
I tried to predict a complex structure and ran into this problem:

```
ValueError: Could not find path to the "hhblits" binary. Make sure it is installed on your system.

Starting prediction...
start to load params multimer.unifold.pt
Traceback (most recent call last):
  File "/zhangyudi/Uni-Fold/unifold/inference.py", line 266, in <module>
    main(args)
  File "/zhangyudi/Uni-Fold/unifold/inference.py", line 91, in main
    model.load_state_dict(state_dict)
  File "/opt/anaconda3/envs/unifold/lib/python3.10/site-packages/torch/nn/modules/module.py", line 2041, in load_state_dict
    raise RuntimeError('Error(s) in loading state_dict for {}:\n\t{}'.format(
RuntimeError: Error(s) in loading state_dict for AlphaFold:
	Missing key(s) in state_dict: "template_pair_embedder.linear.weight", "template_pair_embedder.linear.bias", "template_pointwise_att.mha.linear_q.weight", "template_pointwise_att.mha.linear_k.weight", "template_pointwise_att.mha.linear_v.weight", "template_pointwise_att.mha.linear_o.weight", "template_pointwise_att.mha.linear_o.bias", "structure_module.ipa.linear_q.bias", "structure_module.ipa.linear_kv.weight", "structure_module.ipa.linear_kv.bias", "structure_module.ipa.linear_kv_points.weight", "structure_module.ipa.linear_kv_points.bias".
	Unexpected key(s) in state_dict: "template_proj.output_linear.weight", "template_proj.output_linear.bias", "template_pair_embedder.z_layer_norm.weight", "template_pair_embedder.z_layer_norm.bias", "template_pair_embedder.z_linear.weight", "template_pair_embedder.z_linear.bias", "template_pair_embedder.linear.0.weight", "template_pair_embedder.linear.0.bias", "template_pair_embedder.linear.1.weight", "template_pair_embedder.linear.1.bias", "template_pair_embedder.linear.2.weight", "template_pair_embedder.linear.2.bias", "template_pair_embedder.linear.3.weight", "template_pair_embedder.linear.3.bias", "template_pair_embedder.linear.4.weight", "template_pair_embedder.linear.4.bias", "template_pair_embedder.linear.5.weight", "template_pair_embedder.linear.5.bias", "template_pair_embedder.linear.6.weight", "template_pair_embedder.linear.6.bias", "template_pair_embedder.linear.7.weight", "template_pair_embedder.linear.7.bias", "structure_module.ipa.linear_k.weight", "structure_module.ipa.linear_v.weight", "structure_module.ipa.linear_k_points.weight", "structure_module.ipa.linear_k_points.bias", "structure_module.ipa.linear_v_points.weight", "structure_module.ipa.linear_v_points.bias", "aux_heads.pae.linear.weight", "aux_heads.pae.linear.bias".
	size mismatch for input_embedder.linear_tf_z_i.weight: copying a param with shape torch.Size([128, 21]) from checkpoint, the shape in current model is torch.Size([128, 22]).
	size mismatch for input_embedder.linear_tf_z_j.weight: copying a param with shape torch.Size([128, 21]) from checkpoint, the shape in current model is torch.Size([128, 22]).
	size mismatch for input_embedder.linear_tf_m.weight: copying a param with shape torch.Size([256, 21]) from checkpoint, the shape in current model is torch.Size([256, 22]).
	size mismatch for input_embedder.linear_relpos.weight: copying a param with shape torch.Size([128, 73]) from checkpoint, the shape in current model is torch.Size([128, 65]).
	size mismatch for template_angle_embedder.linear_1.weight: copying a param with shape torch.Size([256, 34]) from checkpoint, the shape in current model is torch.Size([256, 57]).
	size mismatch for aux_heads.masked_msa.linear.weight: copying a param with shape torch.Size([22, 256]) from checkpoint, the shape in current model is torch.Size([23, 256]).
	size mismatch for aux_heads.masked_msa.linear.bias: copying a param with shape torch.Size([22]) from checkpoint, the shape in current model is torch.Size([23]).
```
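The first error means the pipeline cannot locate the `hhblits` executable. A minimal sketch of that check, assuming Uni-Fold resolves the binary from the `PATH` the way AlphaFold-style pipelines typically do (via `shutil.which`):

```python
# Minimal sketch, assuming the binary is resolved from PATH via
# shutil.which (how AlphaFold-style pipelines usually locate it).
import shutil

binary = shutil.which("hhblits")
if binary is None:
    # HH-suite is not installed or not on PATH; one common fix is:
    #   conda install -c bioconda hhsuite
    raise FileNotFoundError('Could not find path to the "hhblits" binary.')
print(f"hhblits found at: {binary}")
```

If `shutil.which("hhblits")` returns `None` inside the environment you run inference from, installing HH-suite (or adding its `bin` directory to `PATH`) should clear the `ValueError`.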

Can you give me some advice?
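For the second error: missing/unexpected keys and size mismatches from `load_state_dict` indicate the checkpoint was trained under a different model configuration than the one being instantiated. Here, the 21-vs-22 target-feature width and 73-vs-65 relpos width suggest a multimer checkpoint (`multimer.unifold.pt`) being loaded into a monomer-configured model. A hypothetical diagnostic sketch for comparing the two key sets (the function name and `model` argument are illustrative, not Uni-Fold API):

```python
import torch
from torch import nn

def diff_state_dict(model: nn.Module, ckpt_path: str) -> None:
    """Print parameter names that differ between a checkpoint and a model.

    `model` is the module being loaded into (here, the AlphaFold module
    built by unifold/inference.py); `ckpt_path` is the .pt checkpoint file.
    """
    state_dict = torch.load(ckpt_path, map_location="cpu")
    ckpt_keys = set(state_dict.keys())
    model_keys = set(model.state_dict().keys())
    print("missing from checkpoint:", sorted(model_keys - ckpt_keys))
    print("unexpected in checkpoint:", sorted(ckpt_keys - model_keys))
```

If the printed diff matches the lists in the traceback above, the likely fix is to instantiate the model with the matching (multimer) configuration rather than to edit the checkpoint.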
