uer committed on
Commit b0d9c99
1 Parent(s): 479e8d5

Update README.md

Files changed (1)
  1. README.md +10 -2
README.md CHANGED
@@ -12,9 +12,9 @@ widget:
 
 ## Model description
 
- This is the set of 24 Chinese RoBERTa models pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658).
+ This is the set of 24 Chinese RoBERTa models pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). Besides, the models could also be pre-trained by [TencentPretrain](https://github.com/Tencent/TencentPretrain) introduced in [this paper](https://arxiv.org/pdf/2212.06385.pdf), which inherits [UER-py](https://github.com/dbiir/UER-py/) to support models with parameters above one billion, and extends it to a multimodal pre-training framework.
 
- [Turc et al.](https://arxiv.org/abs/1908.08962) have shown that the standard BERT recipe is effective on a wide range of model sizes. Following their paper, we released the 24 Chinese RoBERTa models. In order to facilitate users to reproduce the results, we used the publicly available corpus and provided all training details.
+ [Turc et al.](https://arxiv.org/abs/1908.08962) have shown that the standard BERT recipe is effective on a wide range of model sizes. Following their paper, we released the 24 Chinese RoBERTa models. In order to facilitate users in reproducing the results, we used a publicly available corpus and provided all training details.
 
 You can download the 24 Chinese RoBERTa miniatures either from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo), or via HuggingFace from the links below:
 
@@ -189,6 +189,14 @@ python3 scripts/convert_bert_from_uer_to_huggingface.py --input_model_path model
 pages={241},
 year={2019}
 }
+
+ @article{zhao2023tencentpretrain,
+ title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+ author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+ journal={ACL 2023},
+ pages={217},
+ year={2023}
+ }
 ```
 
 [2_128]:https://huggingface.co/uer/chinese_roberta_L-2_H-128
 
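For context, the README updated by this commit describes downloading the miniatures via HuggingFace. Below is a minimal sketch of how one of them could be loaded, assuming the transformers library; the checkpoint name uer/chinese_roberta_L-2_H-128 is taken from the link above, and the fill-mask example sentence is purely illustrative and not part of this commit.

```python
# Minimal sketch, assuming the transformers library is installed.
# The miniatures are BERT-style masked language models, so a fill-mask
# pipeline is one simple way to exercise a downloaded checkpoint.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="uer/chinese_roberta_L-2_H-128")

# Illustrative Chinese sentence with one masked token:
# "The capital of China is [MASK]jing."
print(unmasker("中国的首都是[MASK]京。"))
```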