Replies: 2 comments
-
This is probably because the driver model of the K80 is set to TCC, which is unsupported in WSL2. You can verify this by running `nvidia-smi -i 0 -q` in Command Prompt or PowerShell.
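For reference, the relevant section of the `-q` output looks roughly like the sample below (the values here are illustrative, not from a real K80 — run the actual command to get yours); the `Current` line is the one that matters:

```shell
# Illustrative excerpt of `nvidia-smi -i 0 -q` output on Windows.
sample='    Driver Model
        Current                           : TCC
        Pending                           : TCC'

# Pull out the current driver model. "TCC" means WSL2 cannot use the GPU;
# it would need to be in WDDM mode instead.
current=$(printf '%s\n' "$sample" | awk '/Current/ {print $3}')
echo "Current driver model: $current"
```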
-
This worked for me: https://www.reddit.com/r/pcmods/comments/nhfwh7/guide_using_an_nvidia_tesla_k80_datacenter_gpu/

However, I'm running into problems using it with PyTorch. Basic functionality works, but when I try to use more advanced features, I get: `cuDNN error: CUDNN_STATUS_NOT_INITIALIZED`. I'm guessing I didn't set up the toolkit properly on the WSL side, and it's not a problem with the virtualized access to the GPU.
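In case it helps later readers, a couple of checks I'd run inside the WSL2 Ubuntu shell when cuDNN fails to initialize (this assumes PyTorch is installed as the `torch` package under `python3`):

```shell
# Check that a cuDNN library is installed and visible to the dynamic
# linker; if this prints nothing, cuDNN itself is missing on the WSL side.
ldconfig -p | grep libcudnn

# Ask PyTorch directly: CUDA available, cuDNN available, cuDNN version.
# CUDNN_STATUS_NOT_INITIALIZED often shows up when these disagree.
python3 -c "import torch; print(torch.cuda.is_available(), torch.backends.cudnn.is_available(), torch.backends.cudnn.version())"
```

If the first command finds nothing, reinstalling the toolkit (or installing cuDNN separately) inside WSL is the first thing to try.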
-
Hello everyone,
Is there a way to get a Tesla K80 detected in WSL2, or at least set up well enough to use it in a Docker container?
If so, I would be happy if someone could help me with the setup. The threads I have found here so far have unfortunately not led to success, and I'm not getting any further at the moment.
I have installed Ubuntu with WSL 2 and followed these instructions:
https://developer.nvidia.com/cuda-downloads?target_os=Linux&target_arch=x86_64&Distribution=WSL-Ubuntu&target_version=2.0&target_type=deb_local
I also tried this one:
https://www.forecr.io/blogs/installation/nvidia-docker-installation-for-ubuntu-in-wsl-2
And this one:
https://medium.com/htc-research-engineering-blog/nvidia-docker-on-wsl2-f891dfe34ab
This one will not work, because no other card has the same chipset as the K80:
https://www.reddit.com/r/bashonubuntuonwindows/comments/kgx83d/any_way_to_run_nvidia_tesla_card_on_wsl2/
Every time, I end up with WSL Ubuntu not seeing any NVIDIA card.
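For reference, what I ran inside the WSL2 Ubuntu shell is roughly the deb flow from the first link (file names and versions below are from memory and may differ from what the download page currently shows — take the exact commands from the page itself):

```shell
# Add NVIDIA's CUDA repo keyring for WSL-Ubuntu, then install the toolkit.
wget https://developer.download.nvidia.com/compute/cuda/repos/wsl-ubuntu/x86_64/cuda-keyring_1.1-1_all.deb
sudo dpkg -i cuda-keyring_1.1-1_all.deb
sudo apt-get update
sudo apt-get -y install cuda-toolkit

# Sanity check: inside WSL, the GPU should show up here if passthrough works.
nvidia-smi
```

The last command is where it fails for me: no NVIDIA card is listed.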
Thanks for your help.