Trying to run a variety of ggml models from TheBloke leads to this error:

GGML_ASSERT: llama-cpp/ggml.c:6270: ggml_nelements(a) == ne0*ne1*ne2
Wondering if anyone else is experiencing this, and what the issue might be?
Related: ggerganov/llama.cpp#2445 (comment)
This is probably another issue with the currently used ggml version; a re-sync with the current main branch of llama.cpp is likely needed.
I actually did that and found a failure on the same assert line. The linked comment said rolling the version back worked best.
I'm wondering if this assert assumes constant layer sizes, so modifications like TheBloke's might be causing the failure?