How did you change the tokenizer without losing the meaning of the trained Mistral?

by AiModelsMarket

Hello,
Please help me understand: how can you change the tokenizer of a pretrained model without losing the meaning of the pretraining? Could your model now learn additional languages more easily? Thank you for your time and help. Catalin
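
For context, the usual recipe here is not to swap the tokenizer outright but to extend it: add new tokens and resize the embedding matrix so every existing token id keeps its pretrained embedding, then continue training so the new rows pick up meaning. Below is a minimal sketch, assuming the Hugging Face `transformers` library; the checkpoint name and the new-token list are placeholders, and the mean-initialization at the end is just a common heuristic, not this model's confirmed method.

```python
# A minimal sketch, assuming the Hugging Face `transformers` library;
# the checkpoint name and the new-token list below are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mistralai/Mistral-7B-v0.1"  # hypothetical checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Add tokens instead of replacing the vocabulary, so every existing
# token id keeps the embedding it learned during pretraining.
new_tokens = ["שלום", "תודה"]  # hypothetical tokens for a new language
num_added = tokenizer.add_tokens(new_tokens)

# Grow the embedding matrix: old rows are copied over unchanged,
# new rows are freshly initialized and still need training.
model.resize_token_embeddings(len(tokenizer))

# Common heuristic: start the new rows at the mean of the old
# embeddings, then fine-tune / continue pretraining on the new data.
if num_added:
    with torch.no_grad():
        emb = model.get_input_embeddings().weight
        emb[-num_added:] = emb[:-num_added].mean(dim=0, keepdim=True)
```

Whether the pretrained representations then transfer to an additional language depends mostly on the continued-training data, not on the tokenizer surgery itself.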

Some underwater basket weaver created a Hebrew model without even changing the tokenizer, madness! I heard it is known as Zion_Alpha.
