
Installed intel-extension-for-transformers and I get an error - "No module named 'intel_extension_for_pytorch'" #1230

Open
sungkim11 opened this issue Feb 1, 2024 · 6 comments

@sungkim11

I am using an Arc A770 GPU on Windows 11.

  1. I have installed WSL2.
  2. I have installed Miniconda.
  3. I followed the instructions - "pip install intel-extension-for-transformers".
  4. Ran the example GPU code and got an error message (a quick import check to reproduce it is shown below):
    "Exception has occurred: ModuleNotFoundError
    No module named 'intel_extension_for_pytorch'"
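(A quick way to confirm the missing dependency from the same conda environment; this import check is illustrative, not from the original report:)

python -c "import intel_extension_for_pytorch as ipex; print(ipex.__version__)"
python -c "import torch, intel_extension_for_pytorch; print(torch.xpu.is_available())"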
airMeng (Collaborator) commented Feb 1, 2024

Hi @sungkim11, you could refer to the GPU instructions to enable WOQ (weight-only quantization) on an Intel GPU.
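(For anyone landing here: the GPU WOQ path in intel-extension-for-transformers looked roughly like the sketch below at the time. This is a sketch based on the project's GPU example, not a definitive recipe; the model name is a placeholder and the exact keyword arguments may differ between releases.)

from transformers import AutoTokenizer
from intel_extension_for_transformers.transformers import AutoModelForCausalLM

model_name = "Qwen/Qwen-7B"  # placeholder model id, not from this thread
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
# load_in_4bit requests weight-only quantization; device_map="xpu" targets the Intel GPU
model = AutoModelForCausalLM.from_pretrained(
    model_name, load_in_4bit=True, device_map="xpu", trust_remote_code=True
)

This path imports intel_extension_for_pytorch under the hood, which is why it fails with the ModuleNotFoundError above until IPEX is installed.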

@sungkim11 (Author)

Why is this asking for a username/password:

python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu

source /opt/intel/oneapi/setvars.sh

git clone https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git ipex-gpu
cd ipex-gpu
git checkout -b dev/QLLM origin/dev/QLLM
git submodule update --init --recursive

pip install -r requirements.txt
python setup.py install
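(For what it's worth, a quick way to check whether a clone URL is publicly reachable is git ls-remote; against a private innersource repository it prompts for credentials or fails, which is exactly the symptom here:)

git ls-remote https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git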

@sungkim11 (Author)

I cannot get past this step. Any help?

airMeng (Collaborator) commented Feb 2, 2024

Why is this asking for a username/password:

python -m pip install torch==2.1.2 -f https://developer.intel.com/ipex-whl-stable-xpu

source /opt/intel/oneapi/setvars.sh

git clone https://github.com/intel-innersource/frameworks.ai.pytorch.ipex-gpu.git ipex-gpu
cd ipex-gpu
git checkout -b dev/QLLM origin/dev/QLLM
git submodule update --init --recursive

pip install -r requirements.txt
python setup.py install

Sorry @sungkim11, our fault; it should be https://github.com/intel/intel-extension-for-pytorch.git. This has been updated in #1240, please try again. The corrected sequence is spelled out below.
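(Putting the fix together, the build sequence becomes the following; the only change is the clone URL, and this assumes the dev/QLLM branch also exists on the public repository:)

source /opt/intel/oneapi/setvars.sh

git clone https://github.com/intel/intel-extension-for-pytorch.git ipex-gpu
cd ipex-gpu
git checkout -b dev/QLLM origin/dev/QLLM
git submodule update --init --recursive

pip install -r requirements.txt
python setup.py install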

@sungkim11 (Author)

Error:

UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
warnings.warn(msg.format('we could not find ninja.'))
running build_clib
WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!
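(Side note: the ninja warning itself is harmless but means a much slower distutils build; installing ninja into the same environment before building avoids the fallback:)

pip install ninja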

airMeng (Collaborator) commented Feb 2, 2024

Error:

UserWarning: Attempted to use ninja as the BuildExtension backend but we could not find ninja.. Falling back to using the slow distutils backend.
warnings.warn(msg.format('we could not find ninja.'))
running build_clib
WARNING: Please install flake8 by pip install -r requirements-flake8.txt to check format!

These are just warnings; can you paste the actual errors here, or the full log?
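(One way to capture the full log is to redirect the build output to a file, e.g.:)

python setup.py install 2>&1 | tee build.log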

VincyZhang pushed a commit to VincyZhang/intel-extension-for-transformers that referenced this issue Feb 5, 2024
Signed-off-by: Lv, Liang1 <liang1.lv@intel.com>