Commit 8d98147
Parent(s): 2dd1291
Update README.md
README.md CHANGED
@@ -14,6 +14,8 @@ PERT is a pre-trained model based on permuted language model (PerLM) to learn te
 
 Results on Chinese MRC datasets (EM/F1):
 
+(We report the checkpoint that has the best AVG score)
+
 |            | CMRC 2018 Dev | DRCD Dev  | SQuAD-Zen Dev (Answerable) |    AVG    |
 | :--------: | :-----------: | :-------: | :------------------------: | :-------: |
 | PERT-large |   73.5/90.8   | 91.2/95.7 |         63.0/79.3          | 75.9/88.6 |