In the Webstarter project, the LLamaSharp.Backend.Cuda11 dependency is included by default.
When I add the LLamaSharp.Backend.Cpu dependency to the project instead, it works well.
I have installed the CUDA libraries directly from Nvidia, and the RTX 2060 in my PC has been properly detected.
However, the Webstarter project still doesn't seem to pick up CUDA support. I have also tried LLamaSharp.Backend.Cuda12, but no luck.
Is there a setting that needs to be set?
I get the following error:

```
System.TypeInitializationException: The type initializer for 'LLama.Native.NativeApi' threw an exception.
 ---> LLama.Exceptions.RuntimeError: The native library cannot be found. It could be one of the following reasons:
1. No LLamaSharp backend was installed. Please search LLamaSharp.Backend and install one of them.
2. You are using a device with only CPU but installed cuda backend. Please install cpu backend instead.
3. The backend is not compatible with your system cuda environment. Please check and fix it. If the environment is expected not to be changed, then consider build llama.cpp from source or submit an issue to LLamaSharp.
4. One of the dependency of the native library is missed.
```
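For context, here is a sketch of what the relevant `.csproj` fragment might look like when switching backends. The version number is a placeholder (use whatever LLamaSharp version the project already targets), and the key points are that only one backend package should be referenced at a time, and the Cuda11/Cuda12 package must match the CUDA runtime actually installed on the machine:

```xml
<!-- Hypothetical .csproj fragment; the Version values are placeholders. -->
<ItemGroup>
  <PackageReference Include="LLamaSharp" Version="x.y.z" />
  <!-- Reference exactly one backend. Remove LLamaSharp.Backend.Cpu if it is
       still listed, and pick Cuda11 or Cuda12 to match the installed CUDA
       runtime (check with `nvidia-smi` or `nvcc --version`). -->
  <PackageReference Include="LLamaSharp.Backend.Cuda11" Version="x.y.z" />
</ItemGroup>
```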