---
language:
- zh
license: "apache-2.0"
---

# This model was trained on 180G of data; we recommend using it instead of the original version.

## Chinese ELECTRA
Google and Stanford University released a new pre-trained model called ELECTRA, which has a much more compact model size and competitive performance compared to BERT and its variants.
To further accelerate research on Chinese pre-trained models, the Joint Laboratory of HIT and iFLYTEK Research (HFL) has released Chinese ELECTRA models based on the official code of ELECTRA.
ELECTRA-small can reach similar or even higher scores on several NLP tasks with only 1/10 the parameters of BERT and its variants.

This project is based on the official code of ELECTRA: [https://github.com/google-research/electra](https://github.com/google-research/electra)
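
The checkpoint can be loaded with the Hugging Face `transformers` library. Below is a minimal usage sketch; the repository ID shown is a placeholder assumption, so replace it with this model's actual Hub ID.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Placeholder repository ID (assumption) -- substitute this model's actual Hub ID.
model_id = "hfl/chinese-electra-180g-base-discriminator"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a Chinese sentence and extract contextual representations.
inputs = tokenizer("哈工大讯飞联合实验室", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last hidden states: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```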

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find our resources or paper useful, please consider including the following citation in your paper.
- https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
    title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
    author = "Cui, Yiming and
      Che, Wanxiang and
      Liu, Ting and
      Qin, Bing and
      Wang, Shijin and
      Hu, Guoping",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
    pages = "657--668",
}
```