feat: Support Python 3.12 #28
Yeah, it's not in a usable state yet, especially on Arc...
Any update on this? I tried the 2024.0 and 2024.1 versions of oneAPI, with the latest version of JAX as well as the version prescribed in the README. Neither worked. Using oneAPI 2024.0 and JAX 0.4.25, here is the error I am getting on an Intel GPU Max system:

```
>>> import jax
>>> jax.local_devices()
INFO: Intel Extension for OpenXLA version: 0.3.0, commit: 9a484818
Jax plugin configuration error: Exception when calling jax_plugins.intel_extension_for_openxla.initialize()
Traceback (most recent call last):
  File "/home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jax/_src/xla_bridge.py", line 482, in discover_pjrt_plugins
    plugin_module.initialize()
  File "/home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jax_plugins/intel_extension_for_openxla/__init__.py", line 39, in initialize
    c_api = xb.register_plugin("xpu",
            ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jax/_src/xla_bridge.py", line 544, in register_plugin
    c_api = xla_client.load_pjrt_plugin_dynamically(plugin_name, library_path)  # type: ignore
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jaxlib/xla_client.py", line 155, in load_pjrt_plugin_dynamically
    return _xla.load_pjrt_plugin(plugin_name, library_path, c_api=None)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
jaxlib.xla_extension.XlaRuntimeError: INTERNAL: Failed to open /home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jax_plugins/intel_extension_for_openxla/pjrt_plugin_xpu.so: /home/sdp/.conda/envs/jax/lib/python3.11/site-packages/jax_plugins/intel_extension_for_openxla/pjrt_plugin_xpu.so: undefined symbol: _ZNK4sycl3_V16detail16AccessorBaseHost25isMemoryObjectUsedByGraphEv
[CpuDevice(id=0)]
```
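For anyone debugging a similar failure, the undefined-symbol error above can be reproduced outside of JAX by attempting to dlopen the plugin shared library directly with `ctypes`. A minimal sketch (the path below is a placeholder; substitute the `pjrt_plugin_xpu.so` path from your own site-packages):

```python
import ctypes

def try_load_plugin(path):
    """Attempt to dlopen a PJRT plugin .so; return the error message on failure.

    An undefined-symbol error surfaces here as OSError: the dynamic linker
    cannot resolve a SYCL symbol because the loaded oneAPI runtime differs
    from the version the plugin was compiled against.
    """
    try:
        ctypes.CDLL(path)
        return None  # loaded cleanly
    except OSError as e:
        return str(e)

# Placeholder path -- replace with your actual plugin location.
err = try_load_plugin("/nonexistent/pjrt_plugin_xpu.so")
print(err)
```

If this call fails with the same `undefined symbol` message, the problem is the library/runtime pairing rather than anything in JAX itself.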
Using the latest JAX, oneAPI 2024.1, and the latest openxla build, I am getting a core dump.
I am currently using 2024.1 and it stopped detecting my GPU.
Do you still hit the undefined symbol error even with 2024.1? If so, can you help to
The undefined symbol error is fixed after upgrading to 2024.1. However, it stopped detecting the PJRT plugin/GPU.
@qnixsynapse JAX v0.4.25 doesn't match the released Extension v0.3.0, so please try either of the following two solutions first:
More version-matching info can be found in the release notes: https://github.com/intel/intel-extension-for-openxla/releases. We will add a table to the home page with the matching info later.
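The release notes linked above pair each extension release with a specific JAX version. The kind of check involved can be sketched as follows; note that the version pair in the table below is an illustrative assumption, not the official compatibility matrix, so consult the release notes for real values:

```python
# Extension version -> required JAX version.
# The entry below is an ASSUMED placeholder pair, not official data.
COMPAT = {
    "0.3.0": "0.4.20",
}

def parse(v):
    """Turn a dotted version string into a comparable tuple of ints."""
    return tuple(int(p) for p in v.split("."))

def jax_matches(extension_version, jax_version):
    """True only when the installed JAX exactly matches the pinned version."""
    required = COMPAT.get(extension_version)
    return required is not None and parse(jax_version) == parse(required)

print(jax_matches("0.3.0", "0.4.20"))  # matches the (assumed) pinned version
print(jax_matches("0.3.0", "0.4.25"))  # mismatch, as in the report above
```

The point is that the plugin pins an exact JAX version; a newer JAX is treated as incompatible rather than "newer is fine".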
@Zantares Same issue with v0.4.24. Edit: It seems to be a Python version mismatch, which is why pip is pulling an old pre-release wheel of the openxla plugin. Arch Linux has Python 3.12.3, while the new wheels are built for 3.11.
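This fallback behavior follows from how pip selects wheels: a wheel tagged `cp311` is invisible to a `cp312` interpreter, so pip quietly resolves to an older release that does publish a matching wheel. A quick way to see which tag your interpreter expects, using only the standard library:

```python
import sys
import sysconfig

def interpreter_tag():
    """Return the CPython wheel tag (e.g. 'cp312') of the running interpreter.

    Wheels are filtered by this tag before version resolution, so a package
    whose latest release only ships cp311 wheels appears "missing" on 3.12,
    and pip backtracks to an older release.
    """
    return "cp{}{}".format(sys.version_info.major, sys.version_info.minor)

print(interpreter_tag())
print(sysconfig.get_config_var("SOABI"))  # full ABI string, e.g. 'cpython-312-x86_64-linux-gnu'
```

Comparing this tag against the wheel filenames on PyPI makes the mismatch visible before installing anything.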
Thanks for the trial and the new info. We will check the error with Python 3.12 on Arc. BTW, if the versions were mismatched, wasn't an old version installed rather than Extension v0.3.0?
That's it. We will discuss it internally first to see whether we can release more Python packages (which means more test processes). For now, you can try building it yourself if Python 3.12 is a hard requirement.
I tried to build it today and hit an error. TBH, I am not familiar with Bazel.
Removing (commenting out) that option gives this error:
Hi @qnixsynapse, thanks for your trial, but I have never seen this error before... Have you missed any steps in the instructions (https://github.com/intel/intel-extension-for-openxla?tab=readme-ov-file#install-from-source-build), like
Yes, I was following that while trying to build. I tried building twice but got the exact same error. As I mentioned before, I am not familiar with Bazel. My OS is an up-to-date Arch Linux on an Intel 12th-gen (CPU + Arc A750) PC. The Intel oneAPI Base Toolkit is the latest version and is installed in /opt. Just curious, what version of Bazel is used to build this package?
I'm using Bazel 6.1.0. The configure script checks the Bazel version (https://github.com/intel/intel-extension-for-openxla/blob/main/configure.py#L801), but I'm not sure what will happen with a very new Bazel...
Got it. The version of Bazel on my system is 7.1.1. This might be causing the problem.
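The kind of range check a configure script performs can be sketched as below. The bounds here are assumptions chosen to match the discussion (6.1.0 accepted, 7.1.1 rejected); the actual limits live in the project's configure.py:

```python
def check_bazel_version(version, min_version="5.3.0", max_version="6.5.0"):
    """Return True if `version` falls inside the supported Bazel range.

    The min/max bounds are ASSUMED for illustration -- check configure.py
    for the real values enforced by the build.
    """
    def parse(v):
        return tuple(int(p) for p in v.split("."))
    return parse(min_version) <= parse(version) <= parse(max_version)

print(check_bazel_version("6.1.0"))  # inside the assumed range
print(check_bazel_version("7.1.1"))  # too new for the assumed range
```

Pinning the exact Bazel version (e.g. via a `.bazelversion` file read by bazelisk) sidesteps this class of failure entirely.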
@qnixsynapse
@feng-intel Yes, with the latest release tag. |
This is the error I am getting: