Are some files missing from this upload?

#3
by guoqingkong - opened

The 7b-v0 model has 9 files, but this v1.1 repo has only 7, so I get the error below:
OSError: Can't load tokenizer for 'lmsys/vicuna-7b-delta-v1.1'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure 'lmsys/vicuna-7b-delta-v1.1' is the correct path to a directory containing all relevant files for a LlamaTokenizer tokenizer.

Loading 7b-v0 works fine. Please check it, thanks.

Could you please provide your complete run command with all parameters so that I can identify the issue? It seems like you might be missing the correct parameter for --base-model-path.

I also hit this problem. The repo seems to be missing 'special_tokens_map.json', 'tokenizer_config.json', and 'tokenizer.model'.

You will find the special tokens files in your model directory.
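As a sketch of the workaround suggested above: the tokenizer files ship with the base model weights rather than the delta, so one option is to copy them from the base model directory into the merged model directory. All paths and the helper function below are hypothetical, for illustration only:

```python
import shutil
from pathlib import Path

# Tokenizer files reported missing from the delta repo in this thread.
TOKENIZER_FILES = [
    "special_tokens_map.json",
    "tokenizer_config.json",
    "tokenizer.model",
]

def copy_tokenizer_files(base_model_dir: str, target_model_dir: str) -> list[str]:
    """Copy the base model's tokenizer files into the target directory.

    Returns the list of file names that were actually copied.
    """
    base, target = Path(base_model_dir), Path(target_model_dir)
    copied = []
    for name in TOKENIZER_FILES:
        src = base / name
        if src.exists():  # skip files the base directory also lacks
            shutil.copy2(src, target / name)
            copied.append(name)
    return copied

# Hypothetical usage:
# copy_tokenizer_files("/path/to/llama-7b", "/path/to/vicuna-7b-v1.1")
```

After copying, the merged directory should contain everything a LlamaTokenizer expects to load.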

lmzheng changed discussion status to closed
