Thanks for your great project.
I was trying to evaluate the model both without tuning and with tuning. I wondered whether we can run the evaluation with the original (untuned) model.
Also, if I want to use models other than LLaMA, BLOOM, and GPT-J, do I have to write that part myself?
Thanks
Yes, you can evaluate the original models by commenting out lines 222-227 in evaluate.py. If the model you want to use is already supported, you can simply pass it via the --base_model argument. If not, you need to specify the --target_modules argument, or add a mapping from the model to its target modules in LLM-Adapters/peft/src/peft/mapping.py.
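As a rough sketch, the two invocations described above might look like the following. The model names, and the exact spelling of the --target_modules value, are illustrative assumptions; check the argument parser in evaluate.py for the actual interface:

```shell
# Illustrative only: model names and flag values are examples, not from the repo.
# Evaluate an original (untuned) base model, after commenting out the
# adapter-loading lines (222-227) in evaluate.py:
python evaluate.py --base_model 'yahma/llama-7b-hf'

# For a model without a built-in mapping, pass the LoRA target modules
# explicitly (module names depend on the architecture):
python evaluate.py --base_model 'facebook/opt-6.7b' \
    --target_modules '["q_proj", "v_proj"]'
```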
If you have any questions about adding unsupported models to the code base, please let us know and we will help with it!
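For the mapping.py route, the change amounts to adding one dictionary entry that maps a model type to the module names LoRA should wrap. The sketch below mirrors the structure of the TRANSFORMERS_MODELS_TO_LORA_TARGET_MODULES_MAPPING dict found in PEFT-style code; the existing entries and the new "opt" entry are illustrative examples, not copied from the repository:

```python
# Sketch of the model -> LoRA target-modules mapping, assuming a dict of this
# shape exists in LLM-Adapters/peft/src/peft/mapping.py (entries illustrative).
TRANSFORMERS_MODELS_TO_LORA_TARGET_MODULES_MAPPING = {
    "llama": ["q_proj", "v_proj"],
    "bloom": ["query_key_value"],
    "gptj": ["q_proj", "v_proj"],
}

# To support a new architecture, map its model_type to the attention
# projection modules that LoRA should adapt (hypothetical example):
TRANSFORMERS_MODELS_TO_LORA_TARGET_MODULES_MAPPING["opt"] = ["q_proj", "v_proj"]
```

The keys correspond to the model_type field of the Hugging Face model config, and the values are the submodule names that the LoRA layers replace.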