
Unable to load models plus-16B and plus-6B #39

Open

daubaris opened this issue Jun 28, 2023 · 0 comments

Comments

daubaris commented Jun 28, 2023

Hi, thank you for your work. I'm trying to use the CodeT5+ model types plus-16B and plus-6B. However, when running them, I get the following error:

ValueError: CodeT5pEncoderDecoderModel does not support "device_map='auto'". To implement support, the model class needs to implement the "_no_split_modules" attribute.
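From what I can tell, _no_split_modules is a class attribute that accelerate consults when inferring an automatic device map: it lists the submodule classes that must be kept whole on a single device. A minimal sketch of the kind of declaration the error is asking for is below; the block name is my assumption (the 6B/16B checkpoints are initialized from CodeGen), and I haven't tested this against the remote-code class:

from transformers import PreTrainedModel

class CodeT5pEncoderDecoderModel(PreTrainedModel):
    # Submodule class names that accelerate's infer_auto_device_map must not
    # split across devices when device_map='auto' is requested.
    _no_split_modules = ["CodeGenBlock"]  # assumption: decoder blocks are CodeGen-based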

The code I'm using is the same as provided in the examples:

from codetf.models import load_model_pipeline

code_generation_model = load_model_pipeline(
    model_name="codet5", task="pretrained", model_type="plus-6B",
    is_eval=True, load_in_8bit=True, load_in_4bit=False, weight_sharding=False,
)

result = code_generation_model.predict(["def print_hello_world():"])
print(result)

Any ideas on how the issue could be resolved?
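
In case it helps, a workaround that seems to sidestep the check is to load the checkpoint directly with transformers and place the model on one device explicitly, so that no automatic device map is ever inferred. This sketch follows the Salesforce/codet5p-6b model card and assumes that is the checkpoint behind model_type="plus-6B"; it uses fp16 on a single GPU instead of 8-bit loading:

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "Salesforce/codet5p-6b"  # assumption: the checkpoint behind model_type="plus-6B"
device = "cuda"  # or "cpu"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
# trust_remote_code pulls in the custom CodeT5pEncoderDecoderModel class shipped
# with the checkpoint; no device_map is passed, so accelerate never needs
# _no_split_modules.
model = AutoModelForSeq2SeqLM.from_pretrained(
    checkpoint, torch_dtype=torch.float16, trust_remote_code=True
).to(device)

encoding = tokenizer("def print_hello_world():", return_tensors="pt").to(device)
encoding["decoder_input_ids"] = encoding["input_ids"].clone()
outputs = model.generate(**encoding, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))

This gives up the 8-bit quantization that load_in_8bit=True was meant to provide, though, so a proper fix in CodeTF (e.g. declaring _no_split_modules on the model class) would still be welcome.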
