Re-upload with tokenizer/config changes?

#3
by float-trip

The HF team has made several updates to the main repo for this model since this repo was published. One of them, significantly, fixes the tokenizer:

[screenshot: nous.png, showing the tokenizer fix in the upstream repo]

I've seen a few people getting tripped up by this issue on Discord and reddit: https://www.reddit.com/r/LocalLLaMA/comments/15hz7gl/my_finetuning_based_on_llama27bchathf_model/
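For anyone hitting this, a quick way to check whether a local copy has the fixed tokenizer files is to look at how `</s>` tokenizes (a minimal sketch using `transformers`; the repo ID is just an example, point it at whichever mirror you're actually loading):

```python
from transformers import AutoTokenizer

# Example repo ID - substitute the mirror your tutorial/notebook uses.
tok = AutoTokenizer.from_pretrained("NousResearch/Llama-2-7b-hf")

# With the outdated tokenizer files, "</s>" gets split into ordinary pieces
# instead of mapping to the single EOS token; with the fixed files it should
# come back as one special token matching tok.eos_token_id.
print(tok.tokenize("</s>"))
print(tok("</s>", add_special_tokens=False).input_ids)
print(tok.eos_token, tok.eos_token_id)
```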

NousResearch org

I don't think you need to change the model files to take advantage of changes in the HF Transformers implementation of the model.

Oh sorry, to clarify - I'm referring to changes the HF team has made to the model repos, not transformers.

It seems like a lot of tutorials use these mirrors, which is causing difficult-to-debug problems like the improper tokenization of </s> above.

(Out of curiosity - I noticed the mirror for the 7b chat model is 404ing now. Did you guys delete it, or was it taken down?)

NousResearch org

@LDJnr and I figured no one would want to train on top of that model, so we privated it to clean up. Do you use it?

Nah, just curious. Others apparently use it (someone was asking questions about a Colab that uses it in TheBloke's Discord earlier today, and others are using it in the reddit thread I linked above), but I agree - finetuning an RLHF'd model is probably not ideal.

Re: the original issue, though - let me know if it makes sense why I'm suggesting you re-mirror the updated version, or if I should clarify anything.
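(For reference, re-mirroring the updated tokenizer/config files shouldn't require re-uploading the weights. Something like the sketch below would do it with `huggingface_hub`; the repo IDs and file patterns here are assumptions for illustration, not the exact commands anyone ran.)

```python
from huggingface_hub import HfApi, snapshot_download

# Pull only the updated tokenizer/config files from the upstream repo
# (repo IDs and file patterns are illustrative).
local_dir = snapshot_download(
    repo_id="meta-llama/Llama-2-7b-hf",
    allow_patterns=["tokenizer*", "special_tokens_map.json", "*config.json"],
)

# Push those files into the mirror, leaving the weight shards untouched.
api = HfApi()
api.upload_folder(
    folder_path=local_dir,
    repo_id="NousResearch/Llama-2-7b-hf",
    commit_message="Sync tokenizer/config with upstream",
)
```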

@teknium can you pls upload the 7b chat model again?

NousResearch org

ok fine

NousResearch org

done

teknium changed discussion status to closed

@teknium Thank you very much
