LlamaTokenizer and LlamaTokenizerFast

#4
by chim2000 - opened

I used MMLU as the eval dataset for deepseek-7b base, but the resulting accuracy was 0. After switching the tokenizer to LlamaTokenizer, accuracy rose to 0.249, which suggests nothing is wrong with the inference code itself. Could you explain the difference between LlamaTokenizer and LlamaTokenizerFast, or share the prompts you used?
