What is the max token length?
Hi,
what is the max token length for this embedding model? Is it also 512, like intfloat/multilingual-e5-large-instruct?
Thanks & regards
Hello mox,
thanks for taking an interest in the model. You are right about the context length: it is the standard 512 tokens.
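If you ever want to double-check this yourself, here is a minimal sketch with sentence-transformers (assuming you have the library installed; the model ID below is just an example, swap in whichever checkpoint you are using):

```python
# Minimal sketch: read the configured maximum sequence length of a model.
# Inputs longer than this limit are truncated before embedding.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")
print(model.max_seq_length)  # -> 512 for the e5 models discussed here
```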
Just a word of caution: this model is based on an older version of e5, and the instruct version you've mentioned outperforms this model in most practical scenarios, in English as well as in German.
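In case it helps, this is roughly how queries are formatted for the instruct model (the "Instruct: ... Query: ..." template comes from the intfloat model card; the German task text and sentences below are just made-up examples):

```python
# Sketch of the instruct-style prompt format for multilingual-e5-large-instruct.
# Only queries get the instruction prefix; documents are embedded as-is.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("intfloat/multilingual-e5-large-instruct")

task = "Given a web search query, retrieve relevant passages that answer the query"
query = f"Instruct: {task}\nQuery: Wie beantrage ich einen Reisepass?"
passage = "Einen Reisepass beantragen Sie beim Bürgeramt Ihrer Stadt."

embeddings = model.encode([query, passage], normalize_embeddings=True)
print(embeddings @ embeddings.T)  # cosine similarities (embeddings are L2-normalized)
```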
I'm planning a German-specific adaptation of the instruct version using synthetic data, but that'll probably take a while.
Regards,
Daniel
Thanks, Daniel! Good to know; then I will continue using the e5 instruct model for my German use case for now :) Or do you have another favorite open-source model for German?