antoinelouis committed
Commit 9127425
1 Parent(s): eb3fd3d

Update README.md

Files changed (1): README.md (+9 -9)
README.md CHANGED

@@ -65,15 +65,15 @@ We evaluated the model on 500 random queries from the mMARCO-fr train set (which
 
 Below, we compare the model performance with other cross-encoder models fine-tuned on the same dataset. We report the R-precision (RP), mean reciprocal rank (MRR), and recall at various cut-offs (R@k).
 
-| | model | RP | MRR@10 | R@10 (↑) | R@20 | R@50 | R@100 |
-|---:|:---|---:|---:|---:|---:|---:|---:|
-| 1 | [crossencoder-camembert-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-camembert-base-mmarcoFR) | 35.65 | 50.44 | 82.95 | 91.50 | 96.80 | 98.80 |
-| 2 | [crossencoder-mMiniLMv2-L12-H384-distilled-from-XLMR-Large-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mMiniLMv2-L12-H384-distilled-from-XLMR-Large-mmarcoFR) | 34.37 | 51.01 | 82.23 | 90.60 | 96.45 | 98.40 |
-| 3 | [crossencoder-mmarcoFR-mMiniLMv2-L12-H384-v1-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mmarcoFR-mMiniLMv2-L12-H384-v1-mmarcoFR) | 34.22 | 49.20 | 81.70 | 90.90 | 97.10 | 98.90 |
-| 4 | [crossencoder-mpnet-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mpnet-base-mmarcoFR) | 29.68 | 46.13 | 80.45 | 87.90 | 93.15 | 96.60 |
-| 5 | [crossencoder-distilcamembert-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-distilcamembert-base-mmarcoFR) | 27.28 | 43.71 | 80.30 | 89.10 | 95.55 | 98.60 |
-| 6 | [crossencoder-electra-base-french-europeana-cased-discriminator-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-electra-base-french-europeana-cased-discriminator-mmarcoFR) | 28.32 | 45.28 | 79.22 | 87.15 | 93.15 | 95.75 |
-| 7 | **crossencoder-mMiniLMv2-L6-H384-distilled-from-XLMR-Large-mmarcoFR** | 33.92 | 49.33 | 79.00 | 88.35 | 94.80 | 98.20 |
+| | model | Size | RP | MRR@10 | R@10 (↑) | R@20 | R@50 | R@100 |
+|---:|:---|---:|---:|---:|---:|---:|---:|---:|
+| 1 | [crossencoder-camembert-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-camembert-base-mmarcoFR) | 443MB | 35.65 | 50.44 | 82.95 | 91.50 | 96.80 | 98.80 |
+| 2 | [crossencoder-mMiniLMv2-L12-H384-distilled-from-XLMR-Large-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mMiniLMv2-L12-H384-distilled-from-XLMR-Large-mmarcoFR) | 471MB | 34.37 | 51.01 | 82.23 | 90.60 | 96.45 | 98.40 |
+| 3 | [crossencoder-mmarcoFR-mMiniLMv2-L12-H384-v1-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mmarcoFR-mMiniLMv2-L12-H384-v1-mmarcoFR) | 471MB | 34.22 | 49.20 | 81.70 | 90.90 | 97.10 | 98.90 |
+| 4 | [crossencoder-mpnet-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-mpnet-base-mmarcoFR) | 438MB | 29.68 | 46.13 | 80.45 | 87.90 | 93.15 | 96.60 |
+| 5 | [crossencoder-distilcamembert-base-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-distilcamembert-base-mmarcoFR) | 272MB | 27.28 | 43.71 | 80.30 | 89.10 | 95.55 | 98.60 |
+| 6 | [crossencoder-electra-base-french-europeana-cased-discriminator-mmarcoFR](https://huggingface.co/antoinelouis/crossencoder-electra-base-french-europeana-cased-discriminator-mmarcoFR) | 443MB | 28.32 | 45.28 | 79.22 | 87.15 | 93.15 | 95.75 |
+| 7 | **crossencoder-mMiniLMv2-L6-H384-distilled-from-XLMR-Large-mmarcoFR** | 428MB | 33.92 | 49.33 | 79.00 | 88.35 | 94.80 | 98.20 |
 
 ## Training
 ***
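The metrics reported in the table (MRR@10, R@k, and R-precision) can all be computed from a ranked list of binary relevance labels per query. Below is a minimal illustrative sketch — the function names and the toy ranking are assumptions for demonstration, not part of the model card or its evaluation code:

```python
def mrr_at_k(ranked_rels, k=10):
    """Reciprocal rank of the first relevant candidate in the top k (0.0 if none)."""
    for rank, rel in enumerate(ranked_rels[:k], start=1):
        if rel:
            return 1.0 / rank
    return 0.0

def recall_at_k(ranked_rels, num_relevant, k):
    """Fraction of all relevant passages that appear in the top k."""
    return sum(ranked_rels[:k]) / num_relevant

def r_precision(ranked_rels, num_relevant):
    """Precision at rank R, where R is the total number of relevant passages."""
    return sum(ranked_rels[:num_relevant]) / num_relevant

# Toy ranking for one query: 1 = relevant, 0 = not; 3 relevant passages exist.
ranking = [0, 1, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0]
print(mrr_at_k(ranking, k=10))        # first hit at rank 2 -> 0.5
print(recall_at_k(ranking, 3, k=10))  # 2 of 3 relevant passages in the top 10
print(r_precision(ranking, 3))        # 1 hit in the top 3
```

The table's scores would then be these per-query values averaged over the 500 evaluation queries.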