Debian 12 x LLamaSharp 0.11.2 Crashed Silently #668
Comments
Could you share a link to the model you are trying to load, so we can run a test?
I ran it with the model "llama-2-7b-chat.Q4_K_M.gguf" on the server, but it works fine on my M1 MacBook Pro (macOS Sonoma 14.4). No idea why it was terminated silently.
After investigating with llama.cpp directly, I found that the core dump happens there too! What should I do next?
#./main -ngl 32 -m /user/models/llama-2-7b-chat.Q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "YOUR PROMPT..."
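Before filing upstream, a backtrace from the crashing run makes the llama.cpp report much more actionable. A minimal sketch, assuming gdb is installed and `./main` is the locally built binary (the model path is the one from this thread); the command is kept as a string here so it can be inspected first and then run with `eval "$repro"`:

```shell
# Sketch: reproduce the crash under gdb and print a backtrace (bt).
# Assumptions: gdb is installed, ./main is the llama.cpp binary built
# locally, and the model path matches the one used in this thread.
repro='gdb --batch -ex run -ex bt --args ./main -ngl 32 -m /user/models/llama-2-7b-chat.Q4_K_M.gguf --color -c 4096 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "YOUR PROMPT..."'
echo "$repro"
```

The resulting backtrace (and the signal gdb reports, e.g. SIGSEGV or SIGILL) is exactly what the llama.cpp maintainers will ask for.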
If the problem also happens with the llama.cpp examples (main), you should open the issue against llama.cpp.
@martindevans After I updated to the newest llama.cpp, recompiled the projects, and replaced the two files LLamaSharp.dll and libllama.so in my dotnet project under Debian 12, it works. Amazing!
@kuan2019 The binaries in the master branch were updated last week. Could you please try once more with the current master branch?
Hi,
I was running a console application with LLamaSharp 0.11.2 under Debian 12, and it crashed silently, without any exception, while loading the model file:
using var model = LLamaWeights.LoadFromFile(parameters);
How can I fix this issue? Environment information is below.
Best regards,
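When a .NET app dies with no managed exception, the first thing worth checking is the process exit status: on Linux, a process killed by a signal exits with status 128+N. This is general Linux diagnosis, not something stated in the thread; a minimal sketch, using 139 as a hypothetical example status:

```shell
# A process killed by signal N exits with status 128+N, so a "silent"
# crash can be decoded from the exit code. 139 is a hypothetical
# example value here (128 + 11, i.e. SIGSEGV in the native code).
status=139
signal=$((status - 128))
echo "killed by signal $signal"
```

In practice you would run the app as `dotnet run; echo $?`: an exit code like 134 (SIGABRT) or 139 (SIGSEGV) points at the native libllama.so rather than the managed LLamaSharp code.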