Hi @amasciotta, yes, it builds llama_cpp_python from https://github.com/abetlen/llama-cpp-python, which includes llama.cpp as a submodule. For now, if you need to retain the C++ executables, you may just want to build them yourself; llama.cpp is straightforward to compile these days with -DLLAMA_CUBLAS=on -DLLAMA_CUDA_F16=1
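For reference, a minimal build sketch along those lines might look like the following. This assumes a CUDA toolkit is already installed and that your llama.cpp checkout is old enough to still use the LLAMA_CUBLAS option names; the paths are illustrative, not taken from the container:

```shell
# Clone llama-cpp-python with the llama.cpp submodule included
git clone --recursive https://github.com/abetlen/llama-cpp-python
cd llama-cpp-python/vendor/llama.cpp

# Configure with cuBLAS and FP16 CUDA kernels enabled, then build
mkdir build && cd build
cmake .. -DLLAMA_CUBLAS=on -DLLAMA_CUDA_F16=1
cmake --build . --config Release
```

Alternatively, the same flags can be passed when installing the Python bindings, e.g. `CMAKE_ARGS="-DLLAMA_CUBLAS=on -DLLAMA_CUDA_F16=1" pip install llama-cpp-python`.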
Hello! I am using the prebuilt container dustynv/llama_cpp, which contains the built C++ executables inside /opt/llama.cpp. However, when I try to rebuild it from scratch to pick up some fixes from the main repo, the resulting container has no /opt/llama.cpp folder. I suspect that #422 broke something...