
No module named 'awq_inference_engine' #181

Open
Alpslee opened this issue May 6, 2024 · 2 comments
Comments

Alpslee commented May 6, 2024

python -m awq.entry --model_path awq_cache/llama3-8b-w4-g128.pt \
    --w_bit 4 --q_group_size 128 \
    --run_awq --dump_awq awq_cache/llama3-8b-w4-g128.pt
Traceback (most recent call last):
File "/root/miniconda3/envs/awq/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/root/miniconda3/envs/awq/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/workspace/llm-awq/awq/entry.py", line 15, in <module>
from awq.quantize.pre_quant import run_awq, apply_awq
File "/workspace/llm-awq/awq/quantize/pre_quant.py", line 12, in <module>
from tinychat.models import LlavaLlamaForCausalLM
File "/workspace/llm-awq/tinychat/models/__init__.py", line 1, in <module>
from .falcon import FalconForCausalLM
File "/workspace/llm-awq/tinychat/models/falcon.py", line 11, in <module>
import awq_inference_engine
ModuleNotFoundError: No module named 'awq_inference_engine'

ys-2020 commented May 6, 2024

Hi @Alpslee , thank you for your interest in AWQ & TinyChat. Did you run the following commands to build the awq_inference_engine?

cd awq/kernels
python setup.py install
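After the build finishes, a quick way to confirm the compiled extension is importable before re-running awq.entry (a minimal sketch; the helper function name is mine, not part of the repo):

```python
import importlib.util

def extension_available(name: str = "awq_inference_engine") -> bool:
    """Return True if the module can be found on sys.path."""
    return importlib.util.find_spec(name) is not None

# After a successful `python setup.py install` in awq/kernels,
# extension_available() should return True in the same environment.
```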


Alpslee commented May 7, 2024


I forgot this step, thanks. Now I get a new error:
from transformers.models.gemma.modeling_gemma import GemmaRMSNorm
ModuleNotFoundError: No module named 'transformers.models.gemma'
I tried to update transformers but got this:
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
awq 0.1.0 requires transformers==4.36.2, but you have transformers 4.40.2 which is incompatible.
vila 1.0.0 requires transformers==4.36.2, but you have transformers 4.40.2 which is incompatible.
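The two errors pull in opposite directions: llm-awq pins transformers==4.36.2, but that release predates the gemma model family (added to transformers around 4.38), so the pin and the gemma import cannot both be satisfied without changes to the repo. One way to sanity-check the installed version against the pin (a hypothetical debugging helper, not part of awq or pip):

```python
import importlib.metadata

def version_matches(package: str, required: str) -> bool:
    """True if `package` is installed at exactly version `required`."""
    try:
        return importlib.metadata.version(package) == required
    except importlib.metadata.PackageNotFoundError:
        # Not installed at all also counts as a mismatch.
        return False

# awq 0.1.0 and vila 1.0.0 both require transformers==4.36.2, e.g.:
# version_matches("transformers", "4.36.2")
```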
