uer committed on
Commit 6c5ccca
1 Parent(s): 312ce0c

Update README.md

Files changed (1)
  1. README.md +10 -8
README.md CHANGED
@@ -9,7 +9,9 @@ widget:
 
 ## Model description
 
-The model is used for named entity recognition. You can download the model either from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo) (in UER-py format), or via HuggingFace from the link [roberta-base-finetuned-cluener2020-chinese](https://huggingface.co/uer/roberta-base-finetuned-cluener2020-chinese).
+The model is used for named entity recognition. It is fine-tuned by [UER-py](https://github.com/dbiir/UER-py/), which is introduced in [this paper](https://arxiv.org/abs/1909.05658). Besides, the model could also be fine-tuned by [TencentPretrain](https://github.com/Tencent/TencentPretrain) introduced in [this paper](https://arxiv.org/abs/2212.06385), which inherits UER-py to support models with parameters above one billion, and extends it to a multimodal pre-training framework.
+
+You can download the model either from the [UER-py Modelzoo page](https://github.com/dbiir/UER-py/wiki/Modelzoo), or via HuggingFace from the link [roberta-base-finetuned-cluener2020-chinese](https://huggingface.co/uer/roberta-base-finetuned-cluener2020-chinese).
 
 ## How to use
 
@@ -59,13 +61,6 @@ python3 scripts/convert_bert_token_classification_from_uer_to_huggingface.py --i
 ### BibTeX entry and citation info
 
 ```
-@article{devlin2018bert,
-  title={BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding},
-  author={Devlin, Jacob and Chang, Ming-Wei and Lee, Kenton and Toutanova, Kristina},
-  journal={arXiv preprint arXiv:1810.04805},
-  year={2018}
-}
-
 @article{liu2019roberta,
   title={Roberta: A robustly optimized bert pretraining approach},
   author={Liu, Yinhan and Ott, Myle and Goyal, Naman and Du, Jingfei and Joshi, Mandar and Chen, Danqi and Levy, Omer and Lewis, Mike and Zettlemoyer, Luke and Stoyanov, Veselin},
@@ -87,4 +82,11 @@ python3 scripts/convert_bert_token_classification_from_uer_to_huggingface.py --i
   pages={241},
   year={2019}
 }
+
+@article{zhao2023tencentpretrain,
+  title={TencentPretrain: A Scalable and Flexible Toolkit for Pre-training Models of Different Modalities},
+  author={Zhao, Zhe and Li, Yudong and Hou, Cheng and Zhao, Jing and others},
+  journal={ACL 2023},
+  pages={217},
+  year={2023}
 ```
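
The updated description points readers to the checkpoint on the HuggingFace Hub. As a minimal sketch (not part of this commit), assuming the checkpoint is compatible with the standard `transformers` token-classification (`ner`) pipeline, loading and querying it could look like the following; the example sentence is arbitrary:

```python
# Minimal sketch (assumption): load the NER model referenced in the README
# with the standard Hugging Face transformers token-classification pipeline.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_name = "uer/roberta-base-finetuned-cluener2020-chinese"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(model_name)

# Run named entity recognition on an arbitrary Chinese sentence.
ner = pipeline("ner", model=model, tokenizer=tokenizer)
print(ner("我在北京的清华大学读书。"))
```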