Error while finding module specification for 'local_llm.test.asr' (ModuleNotFoundError: No module named 'local_llm.test') #464

Open
cfregly opened this issue Apr 4, 2024 · 4 comments

Comments


cfregly commented Apr 4, 2024

The local_llm.test module does not exist in the latest Docker image: https://hub.docker.com/layers/dustynv/local_llm/r36.2.0/images/sha256-9533873e7ece872d154c8302036d896adb7b3b79e91337fdd8cedcdd67c20e89?context=explore

Any idea what happened to it?

This affects commands like:

./run.sh dustynv/local_llm:r36.2.0 \
  python3 -m local_llm.test.asr --list-audio-devices
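
A quick way to confirm the subpackage is actually missing from the image (a minimal sketch, using the same container as above; it prints None when Python cannot locate the module):

./run.sh dustynv/local_llm:r36.2.0 \
  python3 -c 'import importlib.util; print(importlib.util.find_spec("local_llm.test"))'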

cfregly commented Apr 4, 2024

root@jetson:/# ls -al /opt/local_llm/local_llm/
total 84
drwxr-xr-x 1 root root  4096 Mar 15 14:52 .
drwxr-xr-x 1 root root  4096 Mar 14 00:01 ..
-rw-rw-r-- 1 root root  4979 Jan 24 03:56 agent.py
drwxr-xr-x 3 root root  4096 Mar 15 14:52 agents
drwxr-xr-x 3 root root  4096 Mar 15 14:52 chat
-rw-rw-r-- 1 root root  1750 Mar  1 06:25 completion.py
-rw-rw-r-- 1 root root   278 Mar  1 06:13 __init__.py
-rw-rw-r-- 1 root root 10120 Mar 15 07:45 local_llm.py
-rw-rw-r-- 1 root root   358 Mar  1 16:49 __main__.py
drwxr-xr-x 3 root root  4096 Mar 15 14:52 models
-rw-rw-r-- 1 root root  9726 Mar 15 07:36 plugin.py
drwxr-xr-x 4 root root  4096 Mar 15 14:52 plugins
-rw-rw-r-- 1 root root    51 Jan 25 23:57 requirements.txt
drwxr-xr-x 3 root root  4096 Mar 15 14:52 utils
drwxr-xr-x 3 root root  4096 Mar 15 14:52 vision
drwxr-xr-x 5 root root  4096 Mar 15 14:52 web


cfregly commented Apr 4, 2024

I manually copied the module sources from here: https://github.com/dusty-nv/jetson-containers/tree/7bf8827/packages/llm/local_llm

and now it works.

Does this image need to be rebuilt, or should I switch to a different image? I'm concerned that other things might be out of sync.
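
Roughly, that manual copy looks like this (a sketch only; <container-id> is a placeholder for the running container, and the commit is the one linked above):

git clone https://github.com/dusty-nv/jetson-containers
cd jetson-containers
git checkout 7bf8827
docker cp packages/llm/local_llm/. <container-id>:/opt/local_llm/local_llm/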


dusty-nv commented Apr 4, 2024

Ahh yes, sorry. I need to push the latest/final image after I do some more testing on it (I changed a bunch of things in the CUDA/Python base containers).

For now you can revert commit 1a7fb07 and add this back into run.sh (or do the mount yourself, which it sounds like you are doing now):

--volume $ROOT/packages/llm/local_llm:/opt/local_llm/local_llm
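
Doing the mount yourself with docker run would look something like this (a sketch; the flags other than --volume are assumptions about a typical Jetson invocation, and $ROOT is a local jetson-containers checkout):

docker run --runtime nvidia -it --rm --network host \
  --volume $ROOT/packages/llm/local_llm:/opt/local_llm/local_llm \
  dustynv/local_llm:r36.2.0 \
  python3 -m local_llm.test.asr --list-audio-devices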


dusty-nv commented Apr 5, 2024

@cfregly FYI I reverted this in the dev branch in commit 7164b42 until it's properly fixed
