Philip May committed
Commit 90de634 · Parent(s): 36500c3

Update README.md

README.md CHANGED
@@ -65,17 +65,17 @@ This model is trained on the following datasets:

 ## Evaluation on MLSUM German Test Set (no beams)

-| Model |
-|
-| mT5-small-sum-de-en-01 (this) |
-| [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) |
+| Model | rouge1 | rouge2 | rougeL | rougeLsum |
+|-------|--------|--------|--------|-----------|
+| mT5-small-sum-de-en-01 (this) | 21.7336 | 7.2614 | 17.1323 | 19.3977 |
+| [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) | 18.3607 | 5.3604 | 14.5456 | 16.1946 |

 ## Evaluation on MLSUM German Test Set (5 beams)

-| Model |
-|
-| mT5-small-sum-de-en-01 (this) |
-| [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) |
+| Model | rouge1 | rouge2 | rougeL | rougeLsum |
+|-------|--------|--------|--------|-----------|
+| mT5-small-sum-de-en-01 (this) | 22.6018 | 7.8047 | 17.1363 | 19.719 |
+| [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) | 19.6166 | 5.8818 | 14.74 | 16.889 |

 ## Evaluation on CNN Daily English Test Set (no beams)
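The diff does not say which tool produced the scores in the tables above; they are presumably computed with a standard ROUGE implementation such as the `rouge-score` package. As a minimal, stdlib-only sketch of what the `rouge1` column measures (unigram-overlap F1, assuming simple lowercase whitespace tokenization, which real implementations refine with stemming and better tokenization):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    ref_counts = Counter(reference.lower().split())
    cand_counts = Counter(candidate.lower().split())
    # Clipped overlap: each candidate unigram counts at most as
    # many times as it appears in the reference.
    overlap = sum(min(n, ref_counts[w]) for w, n in cand_counts.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)
```

A perfect summary scores 1.0 and a summary sharing no words with the reference scores 0.0; the table values (e.g. 21.7336) are such F1 scores averaged over the test set and scaled to percentages. `rouge2` uses bigrams instead of unigrams, and `rougeL`/`rougeLsum` score the longest common subsequence.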