ziqingyang committed
Commit 2b647ca
1 Parent: 4feb14b
Update README.md
README.md
CHANGED
@@ -14,6 +14,8 @@ PERT is a pre-trained model based on permuted language model (PerLM) to learn te
 
 Results on Chinese MRC datasets (EM/F1):
 
+(We report the checkpoint that has the best AVG score)
+
 |           | CMRC 2018 Dev | DRCD Dev  | SQuAD-Zen Dev (Answerable) | AVG       |
 | :-------: | :-----------: | :-------: | :------------------------: | :-------: |
 | PERT-base | 73.2/90.6     | 88.7/94.1 | 59.7/76.5                  | 73.9/87.1 |