---
language:
- zh
license: "apache-2.0"
---

## A Chinese MRC model built on Chinese PERT-large

**Please use `BertForQuestionAnswering` to load this model!**

This is a Chinese machine reading comprehension (MRC) model built on PERT-large and fine-tuned on a mixture of Chinese MRC datasets.
PERT is a pre-trained model based on the permuted language model (PerLM), which learns text semantics in a self-supervised manner without introducing the `[MASK]` token. It yields competitive results in tasks such as reading comprehension and sequence labeling.

Results on Chinese MRC datasets (EM/F1):

|            | CMRC 2018 Dev | DRCD Dev  | SQuAD-Zen Dev (Answerable) | AVG       |
| :--------: | :-----------: | :-------: | :------------------------: | :-------: |
| PERT-large | 73.5/90.8     | 91.2/95.7 | 63.0/79.3                  | 75.9/88.6 |

Please visit our GitHub repo for more information: https://github.com/ymcui/PERT

You may also be interested in:

- Chinese Minority Languages CINO: https://github.com/ymcui/Chinese-Minority-PLM
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology