[Question] Is it by any chance compatible with any tokenizers other than meta-llama/Llama-2-7b-hf?

#16
by helltester - opened

I wanted to know whether, by any chance, this model is compatible with any other tokenizers out there.

Right. This seems to completely destroy the value of an otherwise very open model. Super strange choice.

Is anyone getting good results with a different tokenizer that is actually open? Otherwise, Apple, this means I can't use this model. Great effort though; it would be a shame if this spoils it.

Apple org

I have not tried it, but maybe you can try the tokenizer from the MLX Community:

https://huggingface.co/mlx-community/OpenELM-270M

Apple org
•
edited May 1

Any redistribution of the Llama 2 tokenizer will work. For example, this one: https://huggingface.co/NousResearch/Llama-2-7b-chat-hf. You can use it with `tokenizer = AutoTokenizer.from_pretrained("NousResearch/Llama-2-7b-chat-hf")`. The one posted above by @sacmehta works as well.
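
A minimal end-to-end sketch of what this might look like, assuming the model repo id is apple/OpenELM-270M (adjust to whichever OpenELM checkpoint you are using) and that `transformers` is installed with remote code enabled:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Any faithful redistribution of the Llama 2 tokenizer should work here;
# NousResearch/Llama-2-7b-chat-hf is one openly downloadable copy.
tokenizer = AutoTokenizer.from_pretrained("NousResearch/Llama-2-7b-chat-hf")

# Assumed checkpoint name; OpenELM ships custom modeling code,
# so trust_remote_code=True is required.
model = AutoModelForCausalLM.from_pretrained(
    "apple/OpenELM-270M",
    trust_remote_code=True,
)

inputs = tokenizer("Once upon a time there was", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```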

So why don't you bundle a Llama 2 tokenizer with your models? I don't think that is prohibited, since a lot of other models include it.
