---
language:
- zh
license: "apache-2.0"
---

## Chinese Pre-Trained XLNet
This project provides an XLNet model pre-trained on Chinese text, aiming to enrich Chinese natural language processing resources and broaden the selection of available Chinese pre-trained models.
We welcome all researchers and practitioners to download and use this model.

This project is based on the official CMU/Google XLNet: https://github.com/zihangdai/xlnet
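
As a minimal usage sketch, the model can be loaded with the Hugging Face `transformers` library. Note that the model ID `hfl/chinese-xlnet-base` below is an assumption for illustration; substitute the ID of the checkpoint hosted in this repository.

```python
# Minimal sketch: extract contextual representations with transformers.
# NOTE: "hfl/chinese-xlnet-base" is a placeholder model ID; replace it
# with the actual checkpoint ID of this repository.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("hfl/chinese-xlnet-base")
model = AutoModel.from_pretrained("hfl/chinese-xlnet-base")

# Encode a Chinese sentence and run a forward pass without gradients.
inputs = tokenizer("哈尔滨是黑龙江的省会", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

print(outputs.last_hidden_state.shape)  # (batch_size, sequence_length, hidden_size)
```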

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find our resources or paper useful, please consider including the following citation in your paper.
- https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
  title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
  author = "Cui, Yiming and
    Che, Wanxiang and
    Liu, Ting and
    Qin, Bing and
    Wang, Shijin and
    Hu, Guoping",
  booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
  month = nov,
  year = "2020",
  address = "Online",
  publisher = "Association for Computational Linguistics",
  url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
  pages = "657--668",
}
```