python -m awq.entry --model_path awq_cache/llama3-8b-w4-g128.pt \
    --w_bit 4 --q_group_size 128 \
    --run_awq --dump_awq awq_cache/llama3-8b-w4-g128.pt
Traceback (most recent call last):
File "/root/miniconda3/envs/awq/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/root/miniconda3/envs/awq/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/workspace/llm-awq/awq/entry.py", line 15, in <module>
from awq.quantize.pre_quant import run_awq, apply_awq
File "/workspace/llm-awq/awq/quantize/pre_quant.py", line 12, in <module>
from tinychat.models import LlavaLlamaForCausalLM
File "/workspace/llm-awq/tinychat/models/__init__.py", line 1, in <module>
from .falcon import FalconForCausalLM
File "/workspace/llm-awq/tinychat/models/falcon.py", line 11, in <module>
import awq_inference_engine
ModuleNotFoundError: No module named 'awq_inference_engine'
Hi @Alpslee, thank you for your interest in AWQ & TinyChat. Did you run the following commands to build the awq_inference_engine?
cd awq/kernels
python setup.py install
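After the build, a quick way to confirm the extension is visible to the interpreter before rerunning the entry script (a minimal sketch; `awq_inference_engine` is the module name taken from the traceback above):

```python
# Check whether the compiled AWQ extension can be found on this interpreter's path.
import importlib.util

found = importlib.util.find_spec("awq_inference_engine") is not None
print("awq_inference_engine installed" if found
      else "awq_inference_engine missing: rebuild in awq/kernels")
```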
I forgot this step, thanks. Now I get a new error:
from transformers.models.gemma.modeling_gemma import GemmaRMSNorm
ModuleNotFoundError: No module named 'transformers.models.gemma'
I tried to update transformers but got this:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
awq 0.1.0 requires transformers==4.36.2, but you have transformers 4.40.2 which is incompatible.
vila 1.0.0 requires transformers==4.36.2, but you have transformers 4.40.2 which is incompatible.
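For context on why both errors appear together: to my understanding, `transformers.models.gemma` only exists in transformers 4.38 and later, while `awq` and `vila` pin 4.36.2, which predates it, so the pinned version cannot satisfy the gemma import. A hypothetical helper to illustrate the version gate (the 4.38 cutoff is an assumption based on the release history):

```python
# Gemma support landed in transformers 4.38.0 (assumption), so any version
# pinned below that cannot provide transformers.models.gemma.
def has_gemma_support(version: str) -> bool:
    major, minor = (int(part) for part in version.split(".")[:2])
    return (major, minor) >= (4, 38)

print(has_gemma_support("4.36.2"))  # version pinned by awq/vila -> False
print(has_gemma_support("4.40.2"))  # version installed by pip -> True
```

So either the repo's pin has to move past 4.38, or the code path that imports gemma has to be avoided; the helper above is only illustrative.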