Incompatible with huggingface transformers? #248

Open

hxu105 opened this issue Apr 8, 2024 · 0 comments

hxu105 commented Apr 8, 2024

Howdy,

I am having an issue when I import torchdrug and transformers at the same time. Importing torchdrug somehow disables some functionality in Hugging Face transformers and peft, as well as torch.distributed.fsdp.FullyShardedDataParallel.
[Screenshot: error traceback]

In multi-GPU settings, importing torchdrug causes AssertionError: Expects a fully sharded module but got FullyShardedDataParallel( (_fsdp_wrapped_module):... These errors seem quite strange to me; if you can provide any hint or solution, I would be very grateful for your help.
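For reference, here is roughly the kind of setup where the error shows up (a minimal sketch; the toy model, sizes, and launch command are placeholders, not my actual training code):

```python
# repro.py -- minimal sketch of the failing setup (toy model; not the real code).
# Launch with e.g.: torchrun --nproc_per_node=2 repro.py
import torchdrug  # noqa: F401  -- importing torchdrug first seems to trigger the problem

import torch
import torch.distributed as dist
from torch.distributed.fsdp import FullyShardedDataParallel as FSDP


def main():
    dist.init_process_group("nccl")
    torch.cuda.set_device(dist.get_rank())

    model = torch.nn.Linear(16, 16).cuda()
    model = FSDP(model)  # wrap the module with FSDP

    # With torchdrug imported, the forward/backward step fails with:
    #   AssertionError: Expects a fully sharded module but got
    #   FullyShardedDataParallel( (_fsdp_wrapped_module): ...
    out = model(torch.randn(4, 16, device="cuda"))
    out.sum().backward()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```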

Many thanks,
