
Error when running run.sh for local_llm / __main__.py: error: unrecognized arguments: --max-context-len 768 #462

Open
YoungjaeDev opened this issue Apr 4, 2024 · 1 comment


YoungjaeDev commented Apr 4, 2024

I encountered an issue while executing the following command:

./run.sh $(./autotag local_llm)   python3 -m local_llm --api=mlc     --model liuhaotian/llava-v1.6-vicuna-7b     --max-context-len 768     --max-new-tokens 128
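One way to check which flags the installed local_llm build actually recognizes (assuming the module exposes argparse's standard --help) is:

./run.sh $(./autotag local_llm)   python3 -m local_llm --help

If --max-context-len is not listed there, the container image likely predates that option.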

Removing the --max-context-len parameter leads to another error:

  File "/usr/local/lib/python3.8/dist-packages/tvm/_ffi/base.py", line 481, in raise_last_ffi_error
    raise py_err
  File "tvm/_ffi/_cython/./packed_func.pxi", line 56, in tvm._ffi._cy3.core.tvm_callback
  File "/usr/local/lib/python3.8/dist-packages/mlc_llm/utils.py", line 46, in inner
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/mlc_llm/relax_model/param_manager.py", line 599, in get_item
    load_torch_params_from_bin(torch_binname)
  File "/usr/local/lib/python3.8/dist-packages/mlc_llm/relax_model/param_manager.py", line 559, in load_torch_params_from_bin
    torch_params = torch.load(
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 809, in load
    return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 1172, in _load
    result = unpickler.load()
  File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 1165, in find_class
    return super().find_class(mod_name, name)
ModuleNotFoundError: No module named 'llava'
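The ModuleNotFoundError most likely means the checkpoint being loaded was pickled with references to classes from the llava package, so torch.load needs that package importable inside the container. A quick way to confirm whether it is installed (a sketch reusing the same run.sh/autotag invocation) is:

./run.sh $(./autotag local_llm)   python3 -c "import llava"

If this also raises ModuleNotFoundError, the image simply does not ship the llava package that the checkpoint expects.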

dusty-nv (Owner) commented Apr 4, 2024

@YoungjaeDev it looks like you are on JetPack 5; I would recommend moving to JetPack 6 to pick up the latest model support. I will try to backport some things to JetPack 5, but JetPack 6 GA will be out soon and, moving forward, it will be the primary supported version.
