Installed intel-extension-for-transformers and I get an error: No module named 'intel_extension_for_pytorch' #1230
Hi @sungkim11, you could refer to the GPU instructions to enable WOQ (weight-only quantization) on an Intel GPU.
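As a hedged sketch of what "enabling WOQ on an Intel GPU" usually looks like with this extension: the model name and keyword arguments below are illustrative assumptions, not taken from this thread, so follow the official GPU instructions for the exact, version-matched recipe.

```python
# Hypothetical sketch of weight-only quantization (WOQ) on an Intel GPU
# ("xpu" device) via intel-extension-for-transformers. The checkpoint name
# and the load_in_4bit / device_map arguments are assumptions and may
# differ between versions of the extension.
def load_woq_model(checkpoint: str = "facebook/opt-125m"):
    try:
        from intel_extension_for_transformers.transformers import AutoModelForCausalLM
    except ImportError as err:
        # The failure mode reported in this very issue: the extension (or
        # its intel_extension_for_pytorch dependency) is not installed.
        return f"extension unavailable: {err}"
    return AutoModelForCausalLM.from_pretrained(
        checkpoint,
        load_in_4bit=True,   # 4-bit weight-only quantization (assumed flag)
        device_map="xpu",    # Intel GPU backend, needs intel_extension_for_pytorch
    )

print(load_woq_model())
```

If the print shows `extension unavailable: ...`, the interpreter running the script cannot see the extension, which matches the error in this issue.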
Why is this asking for a username/password?

```shell
python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu
source /opt/intel/oneapi/setvars.sh
git clone https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git ipex-gpu
pip install -r requirements.txt
```
I cannot get past this step. Any help?
Sorry, @sungkim11, our fault. It should be
Error:

```
UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja. Falling back to using the slow distutils backend.
```
These are just warnings. Can you paste the errors here, or the full log?
I am using an Arc A770 GPU on Windows 11.

```
Exception has occurred: ModuleNotFoundError
No module named 'intel_extension_for_pytorch'
```
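When this error appears, a quick way to confirm which interpreter is being used and whether the module is visible to it is a small stdlib-only check (the module names below mirror this issue's setup; nothing here is specific to any one library version):

```python
# Quick environment check for the missing-module error above.
# Run this with the same Python interpreter you use to launch the script.
import importlib.util
import sys

def module_available(name: str) -> bool:
    """Return True if `name` can be imported in this interpreter."""
    return importlib.util.find_spec(name) is not None

print("python:", sys.executable)
for mod in ("torch", "intel_extension_for_pytorch"):
    print(f"{mod}: {'found' if module_available(mod) else 'MISSING'}")
```

If `intel_extension_for_pytorch` shows `MISSING` while the package was installed, the install most likely went into a different environment than the one printed on the `python:` line.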