Breaking changes in the transformers lib

#1
by KnutJaegersberg - opened

I hear some breaking changes require folks to switch out tokenizers. Some claim that the weights have to be reconverted from the Meta weights. On top of that, the textgen webui now has an extra download feature for new tokenizers, which overwrites the ones from this repo, if I understand it right. It seems one has to reconvert from the Meta weights to derive weights one can fine-tune on. Do the weights in this repo have to be reconverted?
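
One quick way to check is simply to load the files with a current transformers release and see what classes resolve; a minimal sketch, with the repo id as a placeholder for this repo or a local download:

```python
# Minimal sanity check (just a sketch, not this repo's documented procedure):
# if the tokenizer and config load cleanly under a recent transformers release,
# the files were converted with a compatible version and no reconversion from
# the Meta weights should be needed for fine-tuning.
from transformers import AutoConfig, AutoTokenizer

repo_id = "your-org/llama-30b-hf"  # placeholder: this repo's id or a local path

tokenizer = AutoTokenizer.from_pretrained(repo_id)
config = AutoConfig.from_pretrained(repo_id)

# Recent conversions resolve to LlamaTokenizer / LlamaTokenizerFast and
# model_type "llama"; outdated conversions typically fail to load because
# the current library no longer recognizes the old tokenizer class name.
print(type(tokenizer).__name__)
print(config.model_type)
```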

This readme says that in the textgen webui your llama tokenizer files are automatically overwritten because they might be outdated.
In an issue on GitHub, people claim the weights have to be reconverted; it could be that you already used the right version.

https://github.com/oobabooga/text-generation-webui/blob/main/docs/LLaMA-model.md

Folks have also reuploaded reconverted llama weights on HF for that reason.
https://huggingface.co/elinas/llama-30b-hf-transformers-4.29

Can you link me to that issue?

Yeah, I'm not affected by the outdated tokenizers. If you want, just compare the files; they should match.
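
For anyone who wants to compare programmatically, hashing the tokenizer files from both sources is enough; a rough sketch, with example directory names only:

```python
# Compare tokenizer files from two local downloads by SHA-256 hash.
# The directory and file names below are examples, not exact paths.
import hashlib
from pathlib import Path

def sha256(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in 1 MiB chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

repo_a = Path("llama-30b-hf")                    # local copy of this repo
repo_b = Path("llama-30b-hf-transformers-4.29")  # local copy of a reconverted upload

for name in ("tokenizer.model", "tokenizer_config.json", "special_tokens_map.json"):
    same = sha256(repo_a / name) == sha256(repo_b / name)
    print(f"{name}: {'match' if same else 'DIFFERENT'}")
```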

KnutJaegersberg changed discussion status to closed

There is some confusion on this, but if you say you did it all with the up-to-date, correct version, then that's fine with me.
