uer committed on
Commit f99ee19
1 Parent(s): 62fc2e8

Update README.md

Files changed (1)
  1. README.md +11 -1
README.md CHANGED
@@ -11,7 +11,9 @@ widget:
 
 ## Model description
 
-The model is used to generate Chinese couplets. You can download the model either from the [GPT2-Chinese Github page](https://github.com/Morizeyao/GPT2-Chinese), or via HuggingFace from the link [gpt2-chinese-couplet](https://huggingface.co/uer/gpt2-chinese-couplet).
+The model is pre-trained by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). The model can also be pre-trained by [TencentPretrain](https://github.com/Tencent/TencentPretrain), introduced in [this paper](https://arxiv.org/abs/2212.06385), which inherits UER-py to support models with more than one billion parameters and extends it to a multimodal pre-training framework.
+
+The model is used to generate Chinese couplets. You can download the model from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo), from the [GPT2-Chinese Github page](https://github.com/Morizeyao/GPT2-Chinese), or via HuggingFace from the link [gpt2-chinese-couplet](https://huggingface.co/uer/gpt2-chinese-couplet).
 
 Since the parameter skip_special_tokens is used in pipelines.py, special tokens such as [SEP] and [UNK] will be deleted, so the output of the Hosted inference API (on the right) may not be displayed properly.
 
@@ -89,4 +91,12 @@ python3 scripts/convert_gpt2_from_uer_to_huggingface.py --input_model_path coupl
 pages={241},
 year={2019}
 }
+
+@article{zhao2023tencentpretrain,
+title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+journal={ACL 2023},
+pages={217},
+year={2023}
+}
 ```
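
The paragraph added in the first hunk points readers to the HuggingFace checkpoint. For reference, a minimal usage sketch (not part of this commit): it assumes the standard transformers text-generation pipeline, and pairs BertTokenizer with GPT2LMHeadModel since the model uses a BERT-style vocabulary; the prompt text and its format (a `[CLS]` prefix and space-separated characters) are illustrative.

```python
# Minimal sketch: generating the second line of a couplet from the first.
# BertTokenizer + GPT2LMHeadModel is assumed because the checkpoint ships
# a BERT-style vocabulary; the prompt format below is an assumption.
from transformers import BertTokenizer, GPT2LMHeadModel, TextGenerationPipeline

tokenizer = BertTokenizer.from_pretrained("uer/gpt2-chinese-couplet")
model = GPT2LMHeadModel.from_pretrained("uer/gpt2-chinese-couplet")
generator = TextGenerationPipeline(model, tokenizer)

# Sample a continuation given the first line of a couplet.
print(generator("[CLS]丹 枫 江 冷 人 初 去 -", max_length=25, do_sample=True))
```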
 
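On the skip_special_tokens note kept in the first hunk: a small sketch of what that parameter changes during decoding, assuming the same tokenizer; the input string is illustrative.

```python
# Sketch of the skip_special_tokens effect on decoding. With
# skip_special_tokens=True (as in the pipeline), special tokens such as
# [CLS]/[SEP]/[UNK] are dropped from the decoded text.
from transformers import BertTokenizer

tokenizer = BertTokenizer.from_pretrained("uer/gpt2-chinese-couplet")

ids = tokenizer.encode("丹 枫 江 冷 人 初 去")  # encode() adds [CLS] ... [SEP]
print(tokenizer.decode(ids, skip_special_tokens=False))  # keeps [CLS]/[SEP]
print(tokenizer.decode(ids, skip_special_tokens=True))   # special tokens removed
```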