
运行报错 (Runtime error) #55

Open
renllll opened this issue Aug 30, 2023 · 4 comments
Comments

@renllll

renllll commented Aug 30, 2023

/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py:1201: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
warnings.warn(
Traceback (most recent call last):
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/infer.py", line 12, in <module>
response, history = model.chat(tokenizer, "问题:" + a.strip() + '\n答案:', max_length=256, history=[])
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 1114, in chat
outputs = self.generate(**input_ids, **gen_kwargs)
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 1452, in generate
return self.sample(
File "/home/remotesense/anaconda3/envs/bencao/lib/python3.9/site-packages/transformers/generation/utils.py", line 2465, in sample
model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
File "/media/remotesense/c076bdaf-88b9-4573-88f1-b4bdb3af3183/jack/chatglm-med/Med-ChatGLM-main/modeling_chatglm.py", line 979, in prepare_inputs_for_generation
mask_position = seq.index(mask_token)
ValueError: 130001 is not in list
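For context, the traceback shows `prepare_inputs_for_generation` calling `seq.index(mask_token)`; Python's `list.index` raises `ValueError` when the value is absent, so the error means the tokenizer never emitted token id 130001 (the mask token id this version of `modeling_chatglm.py` expects). A minimal sketch of that failure mode, using hypothetical token ids:

```python
# Minimal reproduction of the failure mode: list.index raises
# ValueError when the expected mask token id is not in the sequence.
MASK_TOKEN_ID = 130001  # mask token id expected by this modeling code

seq = [5, 64286, 12, 3]  # hypothetical token ids, missing the mask token

try:
    mask_position = seq.index(MASK_TOKEN_ID)
except ValueError as e:
    print(e)  # -> 130001 is not in list
```

A mismatch like this typically means the tokenizer and the modeling code come from different versions and disagree on special-token ids.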

@fengbrute

I ran into the same issue. Have you solved it?
Platform: MacBook Pro (M1 Pro)

@ymx10086

I ran into the same issue as well.

@pengcheng-yan

I ran into the same issue too. Has anyone solved it?

@Charon-HN


See the FAQ section in the project's README.md; it has an entry about this error.

Q: Error `ValueError: 130001 is not in list` / `ValueError: 150001 is not in list`
A: The related dependencies are updated frequently, and version differences can cause bugs.
(1) If the error is `150001 is not in list`, update the repository to the latest version.
(2) If the error is `130001 is not in list`, roll the repository back to commit cb9d827: https://github.com/SCIR-HI/Med-ChatGLM/tree/cb9d82738021ec6f82b307d6031e8595a49dcb00
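Assuming a standard git clone of the repository, the rollback in (2) can be sketched as (adjust the directory to your setup):

```shell
# Clone the repository (skip if you already have a local copy)
git clone https://github.com/SCIR-HI/Med-ChatGLM.git
cd Med-ChatGLM

# Roll back to the commit recommended for the "130001 is not in list" error
git checkout cb9d82738021ec6f82b307d6031e8595a49dcb00
```

Note that `git checkout <commit>` leaves the working tree in a detached-HEAD state, which is fine for running inference from that snapshot.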


5 participants