# multilingual-e5-large-mlx
This model was converted to MLX format from [intfloat/multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large). Refer to the original model card for more details on the model.
## Use with mlx

```shell
pip install mlx
git clone https://github.com/ml-explore/mlx-examples.git
cd mlx-examples/llms/hf_llm
python generate.py --model mlx-community/multilingual-e5-large-mlx --prompt "My name is"
```
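Since E5 is an embedding model, a common workflow is to encode texts (the upstream model card recommends prefixing inputs with `query: ` or `passage: `) and rank passages by cosine similarity. The sketch below uses random NumPy vectors as stand-ins for real model output, which for multilingual-e5-large is 1024-dimensional; the `cosine_similarity` helper is illustrative, not part of any library here.

```python
import numpy as np

# Hypothetical embeddings standing in for model output. In practice these
# would come from encoding "query: ..." and "passage: ..." prefixed texts.
rng = np.random.default_rng(0)
query_emb = rng.normal(size=1024)          # e5-large emits 1024-dim vectors
passage_embs = rng.normal(size=(3, 1024))  # batch of candidate passages

def cosine_similarity(query: np.ndarray, passages: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and a batch of passages."""
    query = query / np.linalg.norm(query)
    passages = passages / np.linalg.norm(passages, axis=-1, keepdims=True)
    return passages @ query

scores = cosine_similarity(query_emb, passage_embs)
best = int(np.argmax(scores))
print(f"best passage: {best}, scores: {np.round(scores, 3)}")
```

Replacing the random vectors with embeddings from the model (e.g. via `sentence-transformers` or the MLX port) gives the usual retrieval ranking.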
## Evaluation results

Self-reported scores on MTEB test sets:

| Task | Language | Accuracy | AP | F1 |
|---|---|---|---|---|
| AmazonCounterfactualClassification | en | 79.060 | 43.487 | 73.327 |
| AmazonCounterfactualClassification | de | 71.221 | 81.558 | 69.283 |
| AmazonCounterfactualClassification | en-ext | 80.420 | 29.349 | 67.625 |
| AmazonCounterfactualClassification | ja | 77.837 | 26.558 | 64.966 |
| AmazonPolarityClassification | – | 93.490 | 90.988 | 93.486 |
| AmazonReviewsClassification | en | 47.564 | – | 46.751 |
| AmazonReviewsClassification | de | 45.400 | – | 44.172 |
| AmazonReviewsClassification | es | 43.068 | – | 42.382 |
| AmazonReviewsClassification | fr | 41.890 | – | 40.844 |
| AmazonReviewsClassification | ja | 40.120 | – | 39.523 |