Philip May committed
Commit 04d4045
Parent: 968a88d

Update README.md

Files changed (1):
  1. README.md +5 -11
README.md CHANGED
```diff
@@ -67,24 +67,18 @@ This model is trained on the following datasets:
 
 | Model | rouge1 | rouge2 | rougeL | rougeLsum
 |-------|--------|--------|--------|----------
-| mT5-small-sum-de-en-01 (this) | 21.7336 | 7.2614 | 17.1323 | 19.3977
 | [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) | 18.3607 | 5.3604 | 14.5456 | 16.1946
-
-## Evaluation on MLSUM German Test Set (5 beams)
-
-| Model | rouge1 | rouge2 | rougeL | rougeLsum
-|-------|--------|--------|--------|----------
-| mT5-small-sum-de-en-01 (this) | 22.6018 | 7.8047 | 17.1363 | 19.719
-| [ml6team/mt5-small-german-finetune-mlsum](https://huggingface.co/ml6team/mt5-small-german-finetune-mlsum) | 19.6166 | 5.8818 | 14.74 | 16.889
+| **mT5-small-sum-de-en-01 (this)** | **21.7336** | **7.2614** | **17.1323** | **19.3977**
 
 ## Evaluation on CNN Daily English Test Set (no beams)
 
 | Model | rouge1 | rouge2 | rougeL | rougeLsum
 |-------|--------|--------|--------|----------
-| mT5-small-sum-de-en-01 (this) | 37.6339 | 16.5317 | 27.1418 | 34.9951
-| [mrm8488/t5-base-finetuned-summarize-news](https://huggingface.co/mrm8488/t5-base-finetuned-summarize-news) | 37.576 | 14.7389 | 24.0254 | 34.4634
-| [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) | 28.5374 | 9.8565 | 19.4829 | 24.7364
 | [sshleifer/distilbart-xsum-12-6](https://huggingface.co/sshleifer/distilbart-xsum-12-6) | 26.7664 | 8.8243 | 18.3703 | 23.2614
+| [facebook/bart-large-xsum](https://huggingface.co/facebook/bart-large-xsum) | 28.5374 | 9.8565 | 19.4829 | 24.7364
+| [mrm8488/t5-base-finetuned-summarize-news](https://huggingface.co/mrm8488/t5-base-finetuned-summarize-news) | 37.576 | 14.7389 | 24.0254 | 34.4634
+| **mT5-small-sum-de-en-01 (this)** | **37.6339** | **16.5317** | **27.1418** | **34.9951**
+
 
 ## Evaluation on Extreme Summarization (XSum) English Test Set (no beams)
```
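The rouge1 and rouge2 columns in the tables above are ROUGE-N F-measures: n-gram overlap between a generated summary and a reference summary. A minimal sketch of the idea, using plain whitespace tokenization and no stemming (the reported scores were presumably computed with the standard `rouge_score` implementation, which handles those details differently):

```python
from collections import Counter


def rouge_n(candidate: str, reference: str, n: int = 1) -> float:
    """F-measure ROUGE-N: n-gram overlap between candidate and reference."""
    def ngrams(text: str, n: int) -> Counter:
        tokens = text.lower().split()
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    # Clipped overlap: each n-gram counts at most as often as it appears in both.
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)


# Identical texts score 1.0; disjoint texts score 0.0.
print(rouge_n("the cat sat on the mat", "the cat sat on the mat", n=1))  # 1.0
print(rouge_n("a b c", "x y z", n=1))  # 0.0
```

Scores in the tables are these F-measures scaled by 100; rougeL/rougeLsum are based on longest common subsequences rather than fixed-size n-grams.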