I encountered an issue while executing the following command:

./run.sh $(./autotag local_llm) python3 -m local_llm --api=mlc --model liuhaotian/llava-v1.6-vicuna-7b --max-context-len 768 --max-new-tokens 128

Removing the --max-context-len parameter leads to another error:
File "/usr/local/lib/python3.8/dist-packages/tvm/_ffi/base.py", line 481, in raise_last_ffi_error
raise py_err
File "tvm/_ffi/_cython/./packed_func.pxi", line 56, in tvm._ffi._cy3.core.tvm_callback
File "/usr/local/lib/python3.8/dist-packages/mlc_llm/utils.py", line 46, in inner
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/mlc_llm/relax_model/param_manager.py", line 599, in get_item
load_torch_params_from_bin(torch_binname)
File "/usr/local/lib/python3.8/dist-packages/mlc_llm/relax_model/param_manager.py", line 559, in load_torch_params_from_bin
torch_params = torch.load(
File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 809, in load
return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 1172, in _load
result = unpickler.load()
File "/usr/local/lib/python3.8/dist-packages/torch/serialization.py", line 1165, in find_class
returnsuper().find_class(mod_name, name)
ModuleNotFoundError: No module named 'llava'
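For context on the last frame: torch.load unpickles Python objects from the .bin file, and if the checkpoint pickles objects whose classes live in the `llava` package, unpickling needs that package to be importable inside the container; otherwise find_class raises exactly this ModuleNotFoundError. Below is a minimal sketch of a pre-flight check, assuming a hypothetical shard filename (substitute the actual .bin file from the model directory):

```python
# Minimal sketch: check that the module a checkpoint's pickle may reference is
# importable before handing the file to torch.load. The shard name below is a
# hypothetical placeholder, not a path taken from the issue.
import importlib.util

import torch

CKPT = "pytorch_model-00001-of-00002.bin"  # hypothetical checkpoint shard

# torch.load unpickles Python objects; any class the pickle references (e.g.
# something from the 'llava' package) must be importable, otherwise
# Unpickler.find_class raises ModuleNotFoundError as in the traceback above.
if importlib.util.find_spec("llava") is None:
    print("warning: 'llava' is not installed; torch.load may fail on "
          "checkpoints that pickle llava objects")

# Load onto CPU just to inspect the contents.
state_dict = torch.load(CKPT, map_location="cpu")
print(f"loaded object of type {type(state_dict).__name__}")
```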
@YoungjaeDev it looks like you are on JetPack 5; I would recommend moving to JetPack 6 to pick up the latest model support. I will try to backport some things to JetPack 5, but JetPack 6 GA will be out soon and, moving forward, it will be the primary supported version.