Excellent work!!!
If I now want to run full-parameter training (non-LoRA) on Llama 2 13B, where in the stage 1–stage 3 code should I make changes to achieve the following two things:
(1) Change the base model to 13B
(2) Full-parameter training
Thanks
For Llama 2 13B: change the Llama model path in the training config file for each stage.
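As a rough illustration of that config change (the exact key names and file layout depend on the repo's actual stage config files, and the checkpoint path below is a placeholder):

```yaml
model:
  arch: mini_gpt4_llama_v2
  # Point this at your local Llama 2 13B checkpoint instead of the 7B one.
  # Path is illustrative; use wherever your 13B weights actually live.
  llama_model: "/path/to/Llama-2-13b-chat-hf"
```

The same edit would need to be repeated in the config file used by each of the three training stages.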
To turn off LoRA and use full-parameter training: comment out the LoRA setup in minigpt4/models/mini_gpt4_llama_v2.py.
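To make the intent of that change concrete, here is a toy sketch (not the repo's actual code; the function and parameter names are made up for illustration) of what disabling LoRA means for which parameters get gradients. With LoRA, base weights are frozen and only the adapter weights train; with LoRA commented out, every base parameter stays trainable:

```python
# Illustrative only: a "model" is stood in for by a list of parameter names.
# In the real repo, LoRA wrapping is done on the Llama model itself.

def trainable_flags(param_names, use_lora):
    """Return {param_name: trains?} under each regime."""
    if use_lora:
        # LoRA regime: base weights frozen, only adapter ("lora_*") weights train.
        return {name: name.startswith("lora_") for name in param_names}
    # Full-parameter regime: everything trains.
    return {name: True for name in param_names}

params = ["model.embed_tokens.weight", "lora_A.weight", "lora_B.weight", "lm_head.weight"]
print(trainable_flags(params, use_lora=False))
# -> every parameter maps to True
```

Note that full-parameter training of a 13B model needs far more GPU memory than LoRA, so expect to revisit batch size and possibly the distributed-training setup as well.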