
ValueError: The version of PEFT you are using is not compatible, please use a version that is greater than 0.5.0 #47

Open
nbasyl opened this issue Nov 15, 2023 · 6 comments


nbasyl commented Nov 15, 2023

Hi,

First of all, thanks for the great work.
I came across an error while attempting to replicate the LoRA result. The error message I received is "ValueError: The version of PEFT you are using is not compatible, please use a version that is greater than 0.5.0." It originates from the Trainer.train() call in finetune.py. I'm unable to resolve this because the PEFT package installed from the local folder is version 0.3.0.

Can you assist me in resolving this error?
Below is the full error message:

Traceback (most recent call last):
  File "finetune.py", line 361, in <module>
    fire.Fire(train)
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/fire/core.py", line 475, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/fire/core.py", line 691, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "finetune.py", line 328, in train
    trainer.train(resume_from_checkpoint=resume_from_checkpoint)
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/transformers/trainer.py", line 1555, in train
    return inner_training_loop(
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/transformers/trainer.py", line 1965, in _inner_training_loop
    self._load_best_model()
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/transformers/trainer.py", line 2184, in _load_best_model
    model.load_adapter(self.state.best_model_checkpoint, model.active_adapter)
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/transformers/integrations/peft.py", line 137, in load_adapter
    check_peft_version(min_version=MIN_PEFT_VERSION)
  File "/home/sliuau/miniconda3/envs/llm-peft/lib/python3.8/site-packages/transformers/utils/peft_utils.py", line 120, in check_peft_version
    raise ValueError(
ValueError: The version of PEFT you are using is not compatible, please use a version that is greater than 0.5.0

Thanks,
Sean


nbasyl commented Nov 15, 2023

Hi, I think this may be because the Transformers package version is too new for the local peft folder; you can check here: https://github.com/huggingface/transformers/blob/2fc33ebead50383f7707b17f0e2a178d86347d10/src/transformers/integrations/peft.py#L33
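A minimal sketch of the kind of version gate that raises this error (not the exact transformers implementation, which reads the installed distribution's metadata): the installed PEFT version string is compared against a minimum, and anything not greater than it is rejected.

```python
def parse_version(v: str) -> tuple:
    """Parse an 'X.Y.Z' version string into a comparable tuple of ints."""
    return tuple(int(part) for part in v.split("."))

def peft_is_compatible(installed: str, min_version: str = "0.5.0") -> bool:
    """Return True if the installed version is greater than the minimum,
    mirroring the error message's 'greater than 0.5.0' wording."""
    return parse_version(installed) > parse_version(min_version)
```

Under this sketch, the repo's local PEFT 0.3.0 fails the check (`peft_is_compatible("0.3.0")` is False), which is why the newer transformers raises the ValueError above.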

@HZQ950419 (Collaborator)

Hi,

Please try uninstalling PEFT before running the code. Our code base extends PEFT 0.3.0 but does not rely on the pip-installed PEFT package.
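A generic diagnostic (not part of the repo) for confirming which copy of a package Python will actually import after uninstalling the pip package: the resolved path should point into the repo's local peft folder, not site-packages.

```python
import importlib.util

def module_location(name: str):
    """Return the file path Python would load for `name`, or None if the
    module cannot be found on sys.path."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None
```

For example, if `module_location("peft")` still shows a path under `.../site-packages/peft/`, the pip-installed copy is shadowing the vendored one and `pip uninstall peft` has not taken effect in the active environment.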


nbasyl commented Nov 15, 2023

Hi, thanks for the quick reply, but I am a bit confused: if PEFT is uninstalled, what happens when transformers calls the peft modules?
Also, is it possible to have your WeChat so we can communicate more efficiently?

@HZQ950419 (Collaborator)

Hi,

The path of our extended PEFT package is indicated on line 16 of finetune.py.
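The usual pattern for making a vendored package shadow any pip-installed copy looks roughly like the following sketch (the actual path used on line 16 of finetune.py may differ; `peft/src` here is a hypothetical example). Prepending to sys.path makes `import peft` resolve to the local folder before site-packages.

```python
import os
import sys

# Hypothetical location of the repo's vendored PEFT sources.
LOCAL_PEFT_DIR = os.path.abspath(os.path.join("peft", "src"))

# Entries earlier in sys.path win during import resolution, so the
# vendored copy is found before any pip-installed one.
sys.path.insert(0, LOCAL_PEFT_DIR)
```

This is also why the vendored copy only works if no newer transformers code path (such as `check_peft_version`) rejects its version string after import.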

Please send me an email to add me on WeChat: zhiqiang_hu@mymail.sutd.edu.sg

@tk1363704

> Hi, I think this may be because the Transformers package version is too new for the local peft folder; you can check here: https://github.com/huggingface/transformers/blob/2fc33ebead50383f7707b17f0e2a178d86347d10/src/transformers/integrations/peft.py#L33

Then what version of transformers is compatible with your 0.3.0 PEFT?

@HZQ950419 (Collaborator)

> Then what version of transformers is compatible with your 0.3.0 PEFT?

Hi, you don't need to install PEFT; just uninstall it and running our code should work.
