Fix for the Transformer optimization warning with IPEX #3266

Closed · bvhari wants to merge 3 commits

Conversation

bvhari (Contributor) commented on Apr 15, 2024

Fix for the following warning: "UserWarning: Transformer opitmization is only support on XPU now." (warning text quoted verbatim)

Explicitly set the value of patch_model_to to the intended torch device obtained from get_torch_device; this fixes the Transformer warning. Currently it takes the model's current device, which is cpu, because the model is loaded onto the cpu by default. A sketch of the change follows below.

Also includes a fix for OOM at VAE decode after switching models.
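For context, here is a minimal sketch of the described change, assuming a simplified loader. ModelPatcher and model_load are illustrative stand-ins, not ComfyUI's actual comfy/model_management.py code; only get_torch_device and the device_to argument of patch_model come from the PR description, and the XPU/CUDA fallback in get_torch_device is an assumption.

```python
import torch

def get_torch_device():
    # Resolve the intended compute device; on an IPEX build this is XPU.
    # (Assumed fallback order, for illustration only.)
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    if torch.cuda.is_available():
        return torch.device("cuda")
    return torch.device("cpu")

class ModelPatcher:
    """Hypothetical stand-in for ComfyUI's model patcher."""

    def __init__(self, model):
        self.model = model  # freshly loaded models sit on the CPU by default

    def patch_model(self, device_to=None):
        # Move (and patch) the model onto the requested device.
        if device_to is not None:
            self.model.to(device_to)
        return self.model

def model_load(patcher):
    # Before the fix: patch_model_to was derived from the model's current
    # device, which is CPU right after loading, so IPEX warned that the
    # transformer optimization only supports XPU.
    # After the fix: explicitly target the device from get_torch_device.
    patch_model_to = get_torch_device()
    return patcher.patch_model(device_to=patch_model_to)

if __name__ == "__main__":
    patcher = ModelPatcher(torch.nn.Linear(4, 4))
    real_model = model_load(patcher)
    print(next(real_model.parameters()).device)
```

The point of the change is that the target device is decided by configuration (get_torch_device) rather than inferred from wherever the weights currently sit.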
bvhari changed the title from "Fix for the Transformer optimization warning" to "Fix for the Transformer optimization warning with IPEX" on Apr 16, 2024
simonlui (Contributor) commented

@bvhari Sorry I didn't see this earlier, but this was fixed via the PR I made at #3388. Hopefully you are able to close this.
