The current model-loading approach raises an error saying the config file cannot be found. Can I use AutoModel directly instead, as shown in the image below?
The current model-loading approach raises an error saying the config file cannot be found -> Can you provide the corresponding error log? Thanks.
Thanks for your feedback! Do you use the AutoPeftModelForCausalLM class here to load the model?
Hello, thanks for your work! After loading the model with AutoPeftModelForCausalLM and continuing training following the LoRA setup code in finetune.py, I get the error below. How can I resolve it? I did set model.tokenizer, but it does not seem to have taken effect:

```
to_regress_embeds, attention_mask, targets, im_mask = self.interleav_wrap(
  File "/root/.cache/huggingface/modules/transformers_modules/xcomposer2-4khd/modeling_internlm_xcomposer2.py", line 226, in interleav_wrap
    part_tokens = self.tokenizer(
TypeError: 'NoneType' object is not callable
```
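One plausible cause (an assumption, not confirmed by the maintainers here): AutoPeftModelForCausalLM returns a PeftModel wrapper around the base model, so `model.tokenizer = tokenizer` sets the attribute on the wrapper while `interleav_wrap()` reads `self.tokenizer` on the underlying base model, which is still `None`. The toy sketch below uses stand-in classes (`BaseModel`, `PeftWrapper` are hypothetical, not real peft classes) to show the mechanics without downloading a model; with real peft, the equivalent fix would be attaching the tokenizer via `model.get_base_model()`.

```python
# Toy illustration of why setting an attribute on a wrapper does not
# reach the wrapped model. BaseModel stands in for the custom
# InternLM-XComposer2 model; PeftWrapper stands in for peft.PeftModel.

class BaseModel:
    def __init__(self):
        self.tokenizer = None          # what interleav_wrap() reads

    def interleav_wrap(self, text):
        return self.tokenizer(text)    # TypeError if tokenizer is None

class PeftWrapper:
    def __init__(self, base):
        self.base_model = base

    def get_base_model(self):
        return self.base_model

wrapper = PeftWrapper(BaseModel())
wrapper.tokenizer = str.upper          # lands on the wrapper only
assert wrapper.base_model.tokenizer is None  # base model still has None

# Fix: attach the tokenizer to the base model the custom code actually uses.
wrapper.get_base_model().tokenizer = str.upper
assert wrapper.base_model.interleav_wrap("ok") == "OK"
```

If this diagnosis matches your setup, try `model.get_base_model().tokenizer = tokenizer` after loading, instead of setting it on the PEFT wrapper.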