
This model was trained on 800,000 Japanese sentences after reducing oshizo/japanese-e5-mistral-7b_slerp to 8 layers.
See this article (in Japanese) for details:
https://note.com/oshizo/n/n9140df790315

See the intfloat/e5-mistral-7b-instruct page for model usage; a sketch following that example is shown below.
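
A minimal usage sketch following the intfloat/e5-mistral-7b-instruct example (instruction-prefixed queries, last-token pooling, cosine similarity over normalized embeddings). The task instruction, max length, and example texts here are placeholders, not values tied to this model's training.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel


def last_token_pool(last_hidden_states: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # e5-mistral uses the hidden state of the last non-padding token as the sentence embedding.
    left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
    if left_padding:
        return last_hidden_states[:, -1]
    sequence_lengths = attention_mask.sum(dim=1) - 1
    batch_size = last_hidden_states.shape[0]
    return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]


# Queries carry a task instruction; passages are embedded as-is (placeholder texts).
task = "Given a web search query, retrieve relevant passages that answer the query"
input_texts = [
    f"Instruct: {task}\nQuery: 日本で一番高い山は？",
    "富士山は標高3,776メートルで、日本で最も高い山です。",
]

tokenizer = AutoTokenizer.from_pretrained("oshizo/japanese-e5-mistral-1.9b")
model = AutoModel.from_pretrained("oshizo/japanese-e5-mistral-1.9b")
if tokenizer.pad_token is None:
    # Assumption: fall back to EOS for padding if no pad token is configured.
    tokenizer.pad_token = tokenizer.eos_token

# Tokenize, append EOS, then pad, mirroring the e5-mistral-7b-instruct example.
batch = tokenizer(input_texts, max_length=511, padding=False, truncation=True, return_attention_mask=False)
batch["input_ids"] = [ids + [tokenizer.eos_token_id] for ids in batch["input_ids"]]
batch = tokenizer.pad(batch, padding=True, return_attention_mask=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**batch)

embeddings = last_token_pool(outputs.last_hidden_state, batch["attention_mask"])
embeddings = F.normalize(embeddings, p=2, dim=1)

# Cosine similarity between the query and the passage.
print((embeddings[:1] @ embeddings[1:].T).item())
```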

