antoinelouis committed f44fd5f (parent: 1ecfd46): Update README.md
# biencoder-distilcamembert-mmarcoFR

This is a dense single-vector bi-encoder model for **French** that can be used for semantic search. The model maps queries and passages to 768-dimensional dense vectors which are used to compute relevance through cosine similarity.

## Usage

```python
# …
similarity = q_embeddings @ p_embeddings.T
print(similarity)
```
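As a minimal, self-contained illustration of the scoring step at the end of the snippet above, the following sketch uses small dummy NumPy vectors in place of real model embeddings (the names `q_embeddings`/`p_embeddings` mirror the snippet; the 4-dimensional vectors are illustrative only, the model produces 768-dimensional ones):

```python
import numpy as np

# Dummy 4-dimensional stand-ins for the embeddings the model would
# produce for two queries and three passages.
q_embeddings = np.array([[0.1, 0.3, 0.5, 0.2],
                         [0.9, 0.1, 0.0, 0.4]])
p_embeddings = np.array([[0.1, 0.3, 0.5, 0.2],
                         [0.0, 1.0, 0.0, 0.0],
                         [0.5, 0.5, 0.5, 0.5]])

# L2-normalize so that the dot product equals cosine similarity.
q = q_embeddings / np.linalg.norm(q_embeddings, axis=1, keepdims=True)
p = p_embeddings / np.linalg.norm(p_embeddings, axis=1, keepdims=True)

similarity = q @ p.T                  # shape (n_queries, n_passages)
best = similarity.argmax(axis=1)      # top-ranked passage index per query
print(similarity)
print(best)                           # query 0 matches passage 0 exactly
```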
## Evaluation

The model is evaluated on the smaller development set of [mMARCO-fr](https://ir-datasets.com/mmarco.html#mmarco/v2/fr/), which consists of 6,980 queries for a corpus of 8.8M candidate passages. We report the mean reciprocal rank (MRR), normalized discounted cumulative gain (NDCG), mean average precision (MAP), and recall at various cut-offs (R@k). To see how it compares to other neural retrievers in French, check out the [*DécouvrIR*](https://huggingface.co/spaces/antoinelouis/decouvrir) leaderboard.

| # | Model | Lang. | #Params | Size | MRR@10 | NDCG@10 | MAP@10 | R@10 | R@100 | R@500 |
|---:|:------------------------------------------------------------------------------------------------------------------------|:-------|--------:|------:|---------:|----------:|---------:|-------:|-----------:|--------:|
| 1 | [biencoder-camembert-base-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-camembert-base-mmarcoFR) | 🇫🇷 | 110M | 443MB | 28.53 | 33.72 | 27.93 | 51.46 | 77.82 | 89.13 |
| 2 | [biencoder-mpnet-base-all-v2-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-mpnet-base-all-v2-mmarcoFR) | 🇬🇧 | 109M | 438MB | 28.04 | 33.28 | 27.50 | 51.07 | 77.68 | 88.67 |
| 3 | **biencoder-distilcamembert-mmarcoFR** | 🇫🇷 | 68M | 272MB | 26.80 | 31.87 | 26.23 | 49.20 | 76.44 | 87.87 |
| 4 | [biencoder-MiniLM-L6-all-v2-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-MiniLM-L6-all-v2-mmarcoFR) | 🇬🇧 | 23M | 91MB | 25.49 | 30.39 | 24.99 | 47.10 | 73.48 | 86.09 |
| 5 | [biencoder-mMiniLMv2-L12-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-mMiniLMv2-L12-mmarcoFR) | 🇫🇷,99+ | 117M | 471MB | 24.74 | 29.41 | 24.23 | 45.40 | 71.52 | 84.42 |
| 6 | [biencoder-camemberta-base-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-camemberta-base-mmarcoFR) | 🇫🇷 | 112M | 447MB | 24.78 | 29.24 | 24.23 | 44.58 | 69.59 | 82.18 |
| 7 | [biencoder-electra-base-french-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-electra-base-french-mmarcoFR) | 🇫🇷 | 110M | 440MB | 23.38 | 27.97 | 22.91 | 43.50 | 68.96 | 81.61 |
| 8 | [biencoder-mMiniLMv2-L6-mmarcoFR](https://huggingface.co/antoinelouis/biencoder-mMiniLMv2-L6-mmarcoFR) | 🇫🇷,99+ | 107M | 428MB | 22.29 | 26.57 | 21.80 | 41.25 | 66.78 | 79.83 |
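For reference, the cut-off metrics in the table can be computed from ranked retrieval results roughly as follows. This is a self-contained sketch with toy data, not the official evaluation code; `mrr_at_k` and `recall_at_k` are illustrative helpers:

```python
def mrr_at_k(ranked_lists, relevant_sets, k=10):
    """Mean reciprocal rank: average of 1/rank of the first relevant hit
    within the top k results (0 if no relevant passage is retrieved)."""
    total = 0.0
    for ranking, relevant in zip(ranked_lists, relevant_sets):
        for rank, doc_id in enumerate(ranking[:k], start=1):
            if doc_id in relevant:
                total += 1.0 / rank
                break
    return total / len(ranked_lists)


def recall_at_k(ranked_lists, relevant_sets, k=10):
    """Average fraction of each query's relevant passages found in the top k."""
    total = 0.0
    for ranking, relevant in zip(ranked_lists, relevant_sets):
        total += len(relevant.intersection(ranking[:k])) / len(relevant)
    return total / len(ranked_lists)


# Toy data: two queries with one relevant passage each.
ranked = [["p3", "p7", "p1"], ["p2", "p9", "p4"]]
relevant = [{"p7"}, {"p5"}]
print(mrr_at_k(ranked, relevant))     # query 1 hits at rank 2, query 2 misses: (1/2 + 0) / 2 = 0.25
print(recall_at_k(ranked, relevant))  # (1/1 + 0/1) / 2 = 0.5
```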
## Training

We use the French training samples from the mMARCO dataset.

The model is initialized from the [cmarkea/distilcamembert-base](https://huggingface.co/cmarkea/distilcamembert-base) checkpoint and optimized via the cross-entropy loss (as in [DPR](https://doi.org/10.48550/arXiv.2004.04906)) with a temperature of 0.05. It is fine-tuned on one 32GB NVIDIA V100 GPU for 20 epochs (i.e., 65.7k steps) using the AdamW optimizer with a batch size of 152 and a peak learning rate of 2e-5, with warm-up over the first 500 steps and linear scheduling. We set the maximum sequence length for both questions and passages to 128 tokens. We use cosine similarity to compute relevance scores.
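The training objective described above, cross-entropy over in-batch negatives with temperature-scaled cosine scores as in DPR, can be sketched in NumPy. This is an illustrative sketch, not the actual training code, and `contrastive_loss` is a hypothetical helper name:

```python
import numpy as np

def contrastive_loss(q_emb, p_emb, temperature=0.05):
    """DPR-style cross-entropy over in-batch negatives: q_emb[i] pairs with
    p_emb[i] as its positive; every other passage in the batch is a negative."""
    q = q_emb / np.linalg.norm(q_emb, axis=1, keepdims=True)
    p = p_emb / np.linalg.norm(p_emb, axis=1, keepdims=True)
    logits = (q @ p.T) / temperature                 # temperature-scaled cosine scores
    logits -= logits.max(axis=1, keepdims=True)      # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    idx = np.arange(len(q_emb))
    return -log_probs[idx, idx].mean()               # positives sit on the diagonal

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 8))
loss = contrastive_loss(queries, queries.copy())     # perfectly aligned pairs give a near-zero loss
print(loss)
```

A low temperature (0.05 here) sharpens the softmax, so small differences in cosine similarity translate into large differences in probability mass during training.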
## Citation

```bibtex
@online{louis2024decouvrir,
	author    = 'Antoine Louis',
	title     = 'DécouvrIR: A Benchmark for Evaluating the Robustness of Information Retrieval Models in French',
	publisher = 'Hugging Face',
	month     = 'mar',
	year      = '2024',
	url       = 'https://huggingface.co/spaces/antoinelouis/decouvrir',
}
```