
Does the Baichuan2-13B-Chat-4bits model support Mac? #404

Open
IguoChan opened this issue Apr 22, 2024 · 0 comments

Comments

@IguoChan

Does this model support Mac? I'm a complete AI beginner. After following the tutorial and trying every fix I could find online, I still get the error below, and I'm not sure whether it's caused by the Mac's M2 chip.

/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/transformers/utils/generic.py:311: UserWarning: torch.utils._pytree._register_pytree_node is deprecated. Please use torch.utils._pytree.register_pytree_node instead.
  torch.utils._pytree._register_pytree_node(
Xformers is not installed correctly. If you want to use memory_efficient_attention to accelerate training use the following command to install Xformers
pip install xformers.
/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/bitsandbytes/cextension.py:34: UserWarning: The installed version of bitsandbytes was compiled without GPU support. 8-bit optimizers, 8-bit multiplication, and GPU quantization are unavailable.
  warn("The installed version of bitsandbytes was compiled without GPU support. "
'NoneType' object has no attribute 'cadam32bit_grad_fp32'


FP4 quantization state not initialized. Please call .cuda() or .to(device) on the LinearFP4 layer first.
Traceback (most recent call last):
  File "/Users/chenyiguo/workspace/aidev/app.py", line 16, in <module>
    response = model.chat(tokenizer, messages)
  File "/Users/chenyiguo/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat-4bits/modeling_baichuan.py", line 823, in chat
    outputs = self.generate(input_ids, generation_config=generation_config)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/utils/_contextlib.py", line 115, in decorate_context
    return func(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 1648, in generate
    return self.sample(
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/transformers/generation/utils.py", line 2730, in sample
    outputs = self(
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/Users/chenyiguo/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat-4bits/modeling_baichuan.py", line 691, in forward
    outputs = self.model(
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/Users/chenyiguo/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat-4bits/modeling_baichuan.py", line 467, in forward
    layer_outputs = decoder_layer(
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/Users/chenyiguo/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat-4bits/modeling_baichuan.py", line 244, in forward
    hidden_states, self_attn_weights, present_key_value = self.self_attn(
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/Users/chenyiguo/.cache/huggingface/modules/transformers_modules/Baichuan2-13B-Chat-4bits/modeling_baichuan.py", line 147, in forward
    proj = self.W_pack(hidden_states)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1511, in _wrapped_call_impl
    return self._call_impl(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/torch/nn/modules/module.py", line 1520, in _call_impl
    return forward_call(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/accelerate/hooks.py", line 166, in new_forward
    output = module._old_forward(*args, **kwargs)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/bitsandbytes/nn/modules.py", line 248, in forward
    out = bnb.matmul_4bit(x, self.weight.t(), bias=bias, quant_state=self.weight.quant_state)
  File "/Users/chenyiguo/Library/Python/3.9/lib/python/site-packages/bitsandbytes/autograd/_functions.py", line 567, in matmul_4bit
    assert quant_state is not None
AssertionError
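The final `AssertionError` (`assert quant_state is not None`) is raised inside bitsandbytes, whose 4-bit kernels require a CUDA GPU; the earlier warning ("compiled without GPU support") shows no CUDA backend was found on the M2, so the `LinearFP4` layers never initialize their quantization state. A minimal sketch of the underlying device constraint (the `pick_device` helper is hypothetical, written here only for illustration):

```python
# Sketch of the device-availability logic behind this failure. Assumption:
# bitsandbytes 4-bit kernels run only on CUDA, so on Apple Silicon (no CUDA,
# only the "mps" backend) the 4-bit checkpoint cannot load.
def pick_device(cuda_available: bool, mps_available: bool) -> str:
    """Return the best available torch device string for this machine."""
    if cuda_available:
        return "cuda"   # only backend on which bnb 4-bit quant_state is set up
    if mps_available:
        return "mps"    # Apple GPU: works for full-precision, not bnb 4-bit
    return "cpu"        # slowest fallback

# On an M2 Mac: no CUDA, MPS available
print(pick_device(cuda_available=False, mps_available=True))  # -> mps
```

In practice this suggests that on Apple Silicon the 4-bit bitsandbytes checkpoint cannot work as-is; likely alternatives are loading the full-precision Baichuan2-13B-Chat weights on `mps`/`cpu` (memory permitting), or using a quantization format with Metal support such as GGUF via llama.cpp.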