hfl-rc committed
Commit a6d7e0f • 1 Parent(s): 1a74094

Update README.md

Files changed (1)
  1. README.md +17 -1
README.md CHANGED
@@ -10,4 +10,20 @@ language:
 license: "apache-2.0"
 ---
 
-TBA
+
+### Chinese ELECTRA
+
+Google and Stanford University released a new pre-trained model called ELECTRA, which has a much more compact model size and relatively competitive performance compared to BERT and its variants. To further accelerate research on Chinese pre-trained models, the Joint Laboratory of HIT and iFLYTEK Research (HFL) has released Chinese ELECTRA models based on the official code of ELECTRA. ELECTRA-small can reach similar or even higher scores on several NLP tasks with only 1/10 of the parameters of BERT and its variants.
+
+This project is based on the official code of ELECTRA: https://github.com/google-research/electra
+
+You may also be interested in:
+
+- Chinese MacBERT: https://github.com/ymcui/MacBERT
+- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
+- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
+- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
+- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer
+
+More resources by HFL: https://github.com/ymcui/HFL-Anthology
+
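
For reference, here is a minimal sketch of loading a Chinese ELECTRA checkpoint from the Hugging Face Hub with the `transformers` library. The model ID `hfl/chinese-electra-180g-small-discriminator` is an assumption standing in for this repository's actual ID, and the snippet only runs the discriminator encoder to obtain hidden states.

```python
# Minimal usage sketch (assumptions: the checkpoint is published on the
# Hugging Face Hub; "hfl/chinese-electra-180g-small-discriminator" stands in
# for this repository's actual model ID).
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "hfl/chinese-electra-180g-small-discriminator"  # assumed ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)  # loads the ELECTRA discriminator encoder

# Encode a short Chinese sentence and run it through the encoder.
inputs = tokenizer("哈工大讯飞联合实验室发布了中文ELECTRA模型。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```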