mrchtr committed
Commit 26e737a
1 Parent(s): 04382e8

Update model card

Files changed (1): README.md +42 -0
---
language:
- de
tags:
- cross-encoder
widget:
- text: "Was sind Lamas? Das Lama (Lama glama) ist eine Art der Kamele. Es ist in den südamerikanischen Anden verbreitet und eine vom Guanako abstammende Haustierform."
  example_title: "Example Query / Paragraph"
license: apache-2.0
metrics:
- Rouge-Score
---
# cross-encoder-mmarco-german-distilbert-base

## Model description
This model is a [cross-encoder](https://www.sbert.net/examples/training/cross-encoder/README.html) fine-tuned on the [MMARCO dataset](https://huggingface.co/datasets/unicamp-dl/mmarco), the machine-translated version of the MS MARCO dataset.
As the base model for fine-tuning we use [distilbert-base-multilingual-cased](https://huggingface.co/distilbert-base-multilingual-cased).

Model input samples are tuples in one of two formats:
`<query, positive_paragraph>` labeled 1, or `<query, negative_paragraph>` labeled 0.
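
A minimal, purely illustrative sketch of this pair format (the query and paragraphs below are made-up placeholders, not samples from MMARCO):

```python
# Hypothetical training pairs in the <query, paragraph> -> label format
# described above: 1 for a relevant paragraph, 0 for an irrelevant one.
query = "Was sind Lamas?"

train_samples = [
    (query, "Das Lama ist eine Art der Kamele aus den Anden.", 1),  # positive pair
    (query, "Der Rhein ist ein Fluss in Mitteleuropa.", 0),         # negative pair
]

labels = [label for _, _, label in train_samples]
```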

The model was trained for 1 epoch.

## Model usage
The cross-encoder model can be used like this:

```python
from sentence_transformers import CrossEncoder

# Replace 'model_name' with the Hugging Face model ID of this cross-encoder
model = CrossEncoder('model_name')
scores = model.predict([('Query 1', 'Paragraph 1'), ('Query 2', 'Paragraph 2')])
```

The model predicts a relevance score for each pair, here `('Query 1', 'Paragraph 1')` and `('Query 2', 'Paragraph 2')`.
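
Beyond scoring individual pairs, a common use of a cross-encoder is reranking candidate paragraphs for a single query. A minimal sketch of that step, using placeholder scores in place of the values `model.predict` would return, so the snippet runs without loading the model:

```python
# Rerank candidate paragraphs by their cross-encoder scores.
paragraphs = [
    "Paragraph A",
    "Paragraph B",
    "Paragraph C",
]

# Placeholder scores standing in for
# model.predict([(query, p) for p in paragraphs])
scores = [0.12, 0.87, 0.45]

# Sort paragraphs by score, highest (most relevant) first.
ranked = [p for _, p in sorted(zip(scores, paragraphs), reverse=True)]
# ranked == ["Paragraph B", "Paragraph C", "Paragraph A"]
```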

For more details on the usage of cross-encoder models, have a look at the [Sentence-Transformers documentation](https://www.sbert.net/).

## Model performance
The model was evaluated on 2,000 held-out evaluation paragraphs of the dataset.

| Accuracy (%) | F1-Score (%) | Precision (%) | Recall (%) |
| --- | --- | --- | --- |
| 89.70 | 86.82 | 86.82 | 93.50 |
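
These metrics follow the standard binary-classification definitions. A small sketch with made-up labels and predictions (not the actual evaluation data) showing how they are computed:

```python
# Toy ground-truth labels and model predictions, for illustration only.
y_true = [1, 1, 1, 0, 0, 1]
y_pred = [1, 1, 0, 0, 1, 1]

# Confusion-matrix counts.
tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)

accuracy = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
```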