uer committed on
Commit 06be8ae
1 Parent(s): 8634d16

Update README.md

Files changed (1)
  1. README.md +10 -1
README.md CHANGED
@@ -12,7 +12,9 @@ widget:
 
 ## Model description
 
-This is the set of Chinese ALBERT models pre-trained by UER-py. You can download the model either from the [UER-py Github page](https://github.com/dbiir/UER-py/), or via HuggingFace from the links below:
+This is the set of Chinese ALBERT models pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). The models can also be pre-trained with [TencentPretrain](https://github.com/Tencent/TencentPretrain), introduced in [this paper](https://arxiv.org/abs/2212.06385), which inherits UER-py to support models with over one billion parameters and extends it to a multimodal pre-training framework.
+
+You can download the model either from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo), or via HuggingFace from the links below:
 
 | | Link |
 | -------- | :-----------------------: |
@@ -149,6 +151,13 @@ python3 scripts/convert_albert_from_uer_to_huggingface.py --input_model_path clu
   pages={241},
   year={2019}
 }
+
+@article{zhao2023tencentpretrain,
+  title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+  author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+  journal={ACL 2023},
+  pages={217},
+  year={2023}
 ```
 
 [base]:https://huggingface.co/uer/albert-base-chinese-cluecorpussmall
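
Since the updated description points readers to the HuggingFace links for downloading the model, a minimal usage sketch follows. It is not part of this commit: the model ID comes from the [base] link in the diff above, and it assumes the usual transformers setup for these Chinese ALBERT checkpoints, where a BERT-style tokenizer is paired with the ALBERT masked-LM head.

```python
# Minimal sketch (not part of this commit): loading the checkpoint referenced
# by the [base] link with the Hugging Face transformers library.
from transformers import BertTokenizer, AlbertForMaskedLM, FillMaskPipeline

# Assumption: these Chinese ALBERT checkpoints ship a BERT-style vocabulary,
# hence BertTokenizer rather than AlbertTokenizer.
tokenizer = BertTokenizer.from_pretrained("uer/albert-base-chinese-cluecorpussmall")
model = AlbertForMaskedLM.from_pretrained("uer/albert-base-chinese-cluecorpussmall")

# Fill in the masked character of a Chinese sentence.
unmasker = FillMaskPipeline(model=model, tokenizer=tokenizer)
print(unmasker("中国的首都是[MASK]京。"))
```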