huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name'

#13
by hwz05 - opened

When I use `snapshot_download` to download the model codellama/CodeLlama-13b-Instruct-hf to the local path "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/ft_local/codellama/CodeLlama-12b-Instruct-hf" and then point the `model_name_or_path` argument in "dbgpt_hub/scripts/train_sft.sh" at that directory, i.e. model_name_or_path="/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/ft_local/codellama/CodeLlama-12b-Instruct-hf", I still get the error shown in the log after the sketch below.
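The download step was along these lines (a minimal sketch; the exact `snapshot_download` arguments are an assumption, with the repo id and target path copied from the description above):

```python
from huggingface_hub import snapshot_download

# Download the full model snapshot into a plain local directory
# (repo id and target path copied from the description above).
snapshot_download(
    repo_id="codellama/CodeLlama-13b-Instruct-hf",
    local_dir="/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/ft_local/codellama/CodeLlama-12b-Instruct-hf",
)
```

Running train_sft.sh then produces: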
W&B offline. Running your script from this directory will only write metadata locally. Use wandb disabled to completely turn off W&B.
/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/transformers/deepspeed.py:23: FutureWarning: transformers.deepspeed module is deprecated and will be removed in a future version. Please import deepspeed modules directly from transformers.integrations
  warnings.warn(
[INFO|training_args.py:1345] 2024-02-23 11:10:27,627 >> Found safetensors installation, but --save_safetensors=False. Safetensors should be a preferred weights saving format due to security and performance reasons. If your model cannot be saved by safetensors please feel free to open an issue at https://github.com/huggingface/safetensors!
[INFO|training_args.py:1798] 2024-02-23 11:10:27,628 >> PyTorch: setting up devices
/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/transformers/training_args.py:1711: FutureWarning: --push_to_hub_token is deprecated and will be removed in version 5 of πŸ€— Transformers. Use --hub_token instead.
  warnings.warn(
/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/datasets/load.py:2089: FutureWarning: 'use_auth_token' was deprecated in favor of 'token' in version 2.14.0 and will be removed in 3.0.0.
You can remove this warning by passing 'token=None' instead.
  warnings.warn(
Using custom data configuration default-e3c6bc7f485aed74
Loading Dataset Infos from /data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/datasets/packaged_modules/json
Overwrite dataset info from restored data version if exists.
Loading Dataset info from /data/home/zanehu/.cache/huggingface/datasets/json/default-e3c6bc7f485aed74/0.0.0/8bb11242116d547c741b2e8a1f18598ffdd40a1d4f2a2872c7a28b697434bc96
Found cached dataset json (/data/home/zanehu/.cache/huggingface/datasets/json/default-e3c6bc7f485aed74/0.0.0/8bb11242116d547c741b2e8a1f18598ffdd40a1d4f2a2872c7a28b697434bc96)
Loading Dataset info from /data/home/zanehu/.cache/huggingface/datasets/json/default-e3c6bc7f485aed74/0.0.0/8bb11242116d547c741b2e8a1f18598ffdd40a1d4f2a2872c7a28b697434bc96
Traceback (most recent call last):
  File "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/train/sft_train.py", line 172, in <module>
    train()
  File "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/train/sft_train.py", line 149, in train
    run_sft(
  File "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/train/sft_train.py", line 48, in run_sft
    model, tokenizer = load_model_and_tokenizer(
  File "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/llm_base/load_tokenizer.py", line 175, in load_model_and_tokenizer
    tokenizer = AutoTokenizer.from_pretrained(
  File "/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 701, in from_pretrained
    tokenizer_config = get_tokenizer_config(pretrained_model_name_or_path, **kwargs)
  File "/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/transformers/models/auto/tokenization_auto.py", line 534, in get_tokenizer_config
    resolved_config_file = cached_file(
  File "/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/transformers/utils/hub.py", line 429, in cached_file
    resolved_file = hf_hub_download(
  File "/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 110, in _inner_fn
    validate_repo_id(arg_value)
  File "/data/home/zanehu/anaconda3/envs/dbgpt_hub/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 158, in validate_repo_id
    raise HFValidationError(
huggingface_hub.utils._validators.HFValidationError: Repo id must be in the form 'repo_name' or 'namespace/repo_name': '/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/ft_local/codellama/CodeLlama-12b-Instruct-hf'. Use repo_type argument if needed.
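For context on what the traceback means: transformers treats `model_name_or_path` as a local directory only when that directory actually exists; otherwise the string is forwarded to `hf_hub_download`, which validates it as a Hub repo id and raises the HFValidationError above. A quick sanity check (a minimal sketch; the path is copied verbatim from the command above) shows whether the directory is visible to the script:

```python
import os

# Path copied verbatim from the model_name_or_path setting above.
local_path = "/data/home/zanehu/hwz_local/DB-GPT-Hub/dbgpt_hub/ft_local/codellama/CodeLlama-12b-Instruct-hf"

# transformers only resolves the string locally if this prints True;
# otherwise it is forwarded to hf_hub_download and rejected as a
# malformed repo id, exactly as in the traceback above.
print(os.path.isdir(local_path))

# List the parent directory to catch folder-name mismatches between what
# snapshot_download created and what the script was pointed at.
parent = os.path.dirname(local_path)
if os.path.isdir(parent):
    print(os.listdir(parent))
```

Note also that the snapshot above was downloaded from codellama/CodeLlama-13b-Instruct-hf while the local directory is named CodeLlama-12b-Instruct-hf, so it is worth double-checking that the folder name in train_sft.sh matches what snapshot_download actually created.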
