
Model loading fails with this error. Which version should I be using? #33

Open
zhangatao opened this issue May 21, 2023 · 0 comments

@zhangatao

from transformers import AutoTokenizer, AutoModel
model_med = AutoModel.from_pretrained("./chatglm-6b-med/", trust_remote_code=True)

File ~/.cache/huggingface/modules/transformers_modules/modeling_chatglm.py:818, in ChatGLMModel.__init__(self, config, empty_init)
816 self.hidden_size_per_attention_head = self.hidden_size // self.num_attention_heads
817 self.position_encoding_2d = config.position_encoding_2d
--> 818 self.pre_seq_len = config.pre_seq_len
819 self.prefix_projection = config.prefix_projection
821 self.word_embeddings = init_method(
822 torch.nn.Embedding,
823 num_embeddings=self.vocab_size, embedding_dim=self.hidden_size,
824 dtype=self.params_dtype
825 )

File /opt/conda/envs/xs_llm/lib/python3.8/site-packages/transformers/configuration_utils.py:260, in PretrainedConfig.__getattribute__(self, key)
258 if key != "attribute_map" and key in super().__getattribute__("attribute_map"):
259 key = super().__getattribute__("attribute_map")[key]
--> 260 return super().__getattribute__(key)

AttributeError: 'ChatGLMConfig' object has no attribute 'pre_seq_len'
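
This error usually indicates that the cached modeling_chatglm.py is newer than the checkpoint's config.json: the modeling code reads the P-tuning v2 fields (pre_seq_len, prefix_projection) unconditionally in __init__, but an older config does not define them. A minimal workaround sketch, assuming the same local path as above and that disabling the prefix encoder (the upstream ChatGLM defaults of None/False) is acceptable for this checkpoint:

from transformers import AutoConfig, AutoModel

# Load the config first so the missing P-tuning fields can be filled in.
config = AutoConfig.from_pretrained("./chatglm-6b-med/", trust_remote_code=True)

# Assumed defaults: None/False disable the prefix encoder in ChatGLM's modeling code.
if not hasattr(config, "pre_seq_len"):
    config.pre_seq_len = None
if not hasattr(config, "prefix_projection"):
    config.prefix_projection = False

model_med = AutoModel.from_pretrained("./chatglm-6b-med/", config=config, trust_remote_code=True)

Alternatively, pin the modeling files and transformers version to the revision the chatglm-6b-med weights were exported with; whether that is the intended fix for this checkpoint is not confirmed here.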
