uer committed
Commit 531062b
Parent: 50bb33c

Update README.md

Files changed (1): README.md +9 -1
README.md CHANGED
@@ -11,7 +11,7 @@ widget:
 
 ## Model description
 
-This model is pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658).
+This model is pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). In addition, the model can also be pre-trained by [TencentPretrain](https://github.com/Tencent/TencentPretrain), introduced in [this paper](https://arxiv.org/abs/2212.06385), which inherits UER-py to support models with more than one billion parameters and extends it to a multimodal pre-training framework.
 
 You can download the set of Chinese PEGASUS models either from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo), or via HuggingFace from the links below:
 
@@ -88,6 +88,14 @@ python3 scripts/convert_pegasus_from_uer_to_huggingface.py --input_model_path cl
   pages={241},
   year={2019}
 }
+
+@article{zhao2023tencentpretrain,
+  title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+  author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+  journal={ACL 2023},
+  pages={217},
+  year={2023}
+}
 ```
 
 [base]:https://huggingface.co/uer/pegasus-base-chinese-cluecorpussmall
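
The README this commit touches points readers to HuggingFace for downloading the models. As an illustration of that path, here is a minimal loading sketch in Python; it assumes the converted checkpoint pairs transformers' PEGASUS model class with a BERT-style tokenizer (the usual convention for UER conversions), and the input string is just a placeholder, not an example from this commit.

```python
# Minimal sketch, assuming the checkpoint works with transformers' PEGASUS
# classes and a BERT-style tokenizer, as UER's converted models typically do.
from transformers import (
    BertTokenizer,
    PegasusForConditionalGeneration,
    Text2TextGenerationPipeline,
)

model_name = "uer/pegasus-base-chinese-cluecorpussmall"
tokenizer = BertTokenizer.from_pretrained(model_name)
model = PegasusForConditionalGeneration.from_pretrained(model_name)

# Wrap model and tokenizer in a text2text pipeline for summarization-style generation.
text2text = Text2TextGenerationPipeline(model=model, tokenizer=tokenizer)

# Placeholder Chinese input; the pipeline returns a list of {"generated_text": ...} dicts.
print(text2text("内容丰富、版式设计考究、图片华丽、印制精美。", max_length=50, do_sample=False))
```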