I'm trying to run the first Text Chat example using the published command line. The container seems to have downloaded and built properly, but it returns: `No module named local_llm.chat.__main__; 'local_llm.chat' is a package and cannot be directly executed`

I'm getting the same error with the Multimodal chat example.
Any idea?
Thanks!
```
./run.sh --env HUGGINGFACE_TOKEN= $(./autotag local_llm) python3 -m local_llm.chat --api=mlc --model=meta-llama/Llama-2-7b-chat-hf

Namespace(disable=[''], output='/tmp/autotag', packages=['local_llm'], prefer=['local', 'registry', 'build'], quiet=False, user='dustynv', verbose=False)
-- L4T_VERSION=35.2.1 JETPACK_VERSION=5.1 CUDA_VERSION=11.4
-- Finding compatible container image for ['local_llm']
dustynv/local_llm:r35.3.1
localuser:root being added to access control list
/usr/bin/python3: No module named local_llm.chat.__main__; 'local_llm.chat' is a package and cannot be directly executed
```
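For context, this error comes from Python itself, not from local_llm: `python3 -m some.package` only works if the package contains a `__main__.py` entry point, so the message suggests the container's copy of `local_llm/chat/` is missing that file (e.g. an older image than the published command expects). A minimal sketch reproducing the same error with a hypothetical `demo_pkg` (not the local_llm fix):

```python
# Reproduce Python's "cannot be directly executed" error by running
# `python -m` on a package that has __init__.py but no __main__.py.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as tmp:
    pkg = os.path.join(tmp, "demo_pkg")
    os.makedirs(pkg)
    # A package with an __init__.py but no __main__.py entry point
    open(os.path.join(pkg, "__init__.py"), "w").close()

    result = subprocess.run(
        [sys.executable, "-m", "demo_pkg"],
        cwd=tmp, capture_output=True, text=True,
    )
    # Python reports: No module named demo_pkg.__main__; 'demo_pkg' is a
    # package and cannot be directly executed
    print(result.stderr.strip())
```

If that is the cause here, pulling a container image that matches the current source tree (rather than a cached older tag) should restore the missing entry point.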