kevinwang676 committed
Commit • e77303a
Parent(s): d771dd4
Delete README.md
README.md DELETED
@@ -1,57 +0,0 @@
---
language:
- zh
tags:
- bert
license: "apache-2.0"
---

# Please use 'Bert'-related functions to load this model!

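A minimal loading sketch with Hugging Face Transformers follows; the checkpoint id `hfl/chinese-bert-wwm` is an assumption for illustration, so substitute this repository's own id if it differs.

```
import torch
from transformers import BertTokenizer, BertModel

# Load with the explicit Bert* classes, as the model card requests.
# NOTE: the checkpoint id below is an assumption for illustration;
# replace it with this repository's own id if it differs.
tokenizer = BertTokenizer.from_pretrained("hfl/chinese-bert-wwm")
model = BertModel.from_pretrained("hfl/chinese-bert-wwm")

inputs = tokenizer("使用语言模型来预测下一个词的概率。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # torch.Size([1, seq_len, 768])
```

Using the explicit `Bert*` classes makes the architecture requirement unambiguous, rather than relying on the auto-mapping to resolve the right classes from the checkpoint's config.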
## Chinese BERT with Whole Word Masking
To further accelerate Chinese natural language processing, we provide **Chinese pre-trained BERT with Whole Word Masking**.
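As a rough illustration of how whole word masking differs from character-level masking, here is a toy sketch; the word segmentation and `mask_prob` are hard-coded assumptions for the example (the released models rely on an external Chinese word segmenter upstream):

```
import random

# Toy whole-word-masking (WWM) sketch. Chinese BERT tokenizes text into
# single characters, so plain MLM can mask only part of a word; WWM picks
# whole words and masks all of their characters together.
# The segmentation below is hard-coded for illustration.
words = ["使用", "语言", "模型", "来", "预测", "下一个", "词", "的", "概率"]

def whole_word_mask(words, mask_prob=0.15, seed=0):
    rng = random.Random(seed)
    tokens = []
    for word in words:
        if rng.random() < mask_prob:
            # Every character of the selected word is masked, not just one.
            tokens.extend(["[MASK]"] * len(word))
        else:
            tokens.extend(list(word))
    return tokens

print(" ".join(whole_word_mask(words)))
```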

**[Pre-Training with Whole Word Masking for Chinese BERT](https://arxiv.org/abs/1906.08101)**
Yiming Cui, Wanxiang Che, Ting Liu, Bing Qin, Ziqing Yang, Shijin Wang, Guoping Hu

This repository is developed based on https://github.com/google-research/bert

You may also be interested in:
- Chinese BERT series: https://github.com/ymcui/Chinese-BERT-wwm
- Chinese MacBERT: https://github.com/ymcui/MacBERT
- Chinese ELECTRA: https://github.com/ymcui/Chinese-ELECTRA
- Chinese XLNet: https://github.com/ymcui/Chinese-XLNet
- Knowledge Distillation Toolkit - TextBrewer: https://github.com/airaria/TextBrewer

More resources by HFL: https://github.com/ymcui/HFL-Anthology

## Citation
If you find the technical report or resources useful, please cite the following technical reports in your paper.
- Primary: https://arxiv.org/abs/2004.13922
```
@inproceedings{cui-etal-2020-revisiting,
    title = "Revisiting Pre-Trained Models for {C}hinese Natural Language Processing",
    author = "Cui, Yiming and
      Che, Wanxiang and
      Liu, Ting and
      Qin, Bing and
      Wang, Shijin and
      Hu, Guoping",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: Findings",
    month = nov,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.findings-emnlp.58",
    pages = "657--668",
}
```
- Secondary: https://arxiv.org/abs/1906.08101
```
@article{chinese-bert-wwm,
  title={Pre-Training with Whole Word Masking for Chinese BERT},
  author={Cui, Yiming and Che, Wanxiang and Liu, Ting and Qin, Bing and Yang, Ziqing and Wang, Shijin and Hu, Guoping},
  journal={arXiv preprint arXiv:1906.08101},
  year={2019}
}
```