
Training the llama-30b model throws an error; does this mean llama-30b is not supported? #30

Open
13416157913 opened this issue Sep 22, 2023 · 0 comments

Comments

@13416157913

building PretrainedFromHF tokenizer ...
Traceback (most recent call last):
File "/home/llm-deploy/Megatron-LLaMA/pretrain_llama.py", line 119, in <module>
pretrain(train_valid_test_datasets_provider, model_provider,
File "/home/llm-deploy/Megatron-LLaMA/megatron/training.py", line 90, in pretrain
initialize_megatron(extra_args_provider=extra_args_provider,
File "/home/llm-deploy/Megatron-LLaMA/megatron/initialize.py", line 50, in initialize_megatron
set_global_variables(args)
File "/home/llm-deploy/Megatron-LLaMA/megatron/global_vars.py", line 92, in set_global_variables
_ = _build_tokenizer(args)
File "/home/llm-deploy/Megatron-LLaMA/megatron/global_vars.py", line 125, in _build_tokenizer
_GLOBAL_TOKENIZER = build_tokenizer(args)
File "/home/llm-deploy/Megatron-LLaMA/megatron/tokenizer/tokenizer.py", line 46, in build_tokenizer
tokenizer = _AutoTokenizer(args.tokenizer_name_or_path, vocab_extra_ids=args.vocab_extra_ids)
File "/home/llm-deploy/Megatron-LLaMA/megatron/tokenizer/tokenizer.py", line 554, in __init__
self.tokenizer = AutoTokenizer.from_pretrained(tokenizer_name_or_path, **hf_tokenizer_kwargs)
File "/home/llm-deploy/anaconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 652, in from_pretrained
tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
File "/home/llm-deploy/anaconda3/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 496, in get_tokenizer_config
resolved_config_file = cached_file(
File "/home/llm-deploy/anaconda3/lib/python3.10/site-packages/transformers/utils/hub.py", line 417, in cached_file
resolved_file = hf_hub_download(
File "/home/llm-deploy/anaconda3/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
validate_repo_id(arg_value)
File "/home/llm-deploy/anaconda3/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 158, in validate_repo_id
raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/xxx/megatron-llama-30b-checkpoint'. Use repo_type argument if needed.
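This error is not about llama-30b support: transformers could not find a checkpoint directory at the path passed via --tokenizer_name_or_path, so it fell back to treating the string as a Hugging Face Hub repo id, and huggingface_hub rejected it because repo ids must look like 'repo_name' or 'namespace/repo_name' (an absolute path like '/xxx/megatron-llama-30b-checkpoint' cannot). The sketch below illustrates the failure mode with a simplified, assumed version of the repo-id rule (not the exact huggingface_hub regex); the usual fix is to make sure the tokenizer directory actually exists on disk and contains the tokenizer files.

```python
import os
import re

# Simplified stand-in (assumption) for huggingface_hub's repo-id rule:
# a repo id is 'repo_name' or 'namespace/repo_name', never an absolute path.
REPO_ID_RE = re.compile(r"^[A-Za-z0-9_.\-]+(/[A-Za-z0-9_.\-]+)?$")

def looks_like_repo_id(name_or_path: str) -> bool:
    """True if the string is shaped like a Hub repo id."""
    return bool(REPO_ID_RE.match(name_or_path))

def resolve_tokenizer_path(name_or_path: str) -> str:
    """Mimic from_pretrained's fallback: prefer a local directory,
    otherwise require a valid Hub repo id."""
    if os.path.isdir(name_or_path):
        # Local checkpoint found: the Hub is never contacted.
        return name_or_path
    if looks_like_repo_id(name_or_path):
        # Would be resolved against the Hugging Face Hub.
        return name_or_path
    # This is the situation in the traceback: the directory is missing,
    # and the path is not a valid repo id either.
    raise ValueError(
        f"'{name_or_path}' is neither an existing directory nor a valid "
        "repo id ('repo_name' or 'namespace/repo_name')"
    )
```

With a missing directory such as '/xxx/megatron-llama-30b-checkpoint', resolve_tokenizer_path raises, mirroring the HFValidationError above; once the directory exists and holds tokenizer_config.json etc., the same argument loads fine.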
