uer committed on
Commit 2814160
1 Parent(s): 13951c7

Update README.md

Files changed (1)
  1. README.md +8 -1
README.md CHANGED
@@ -36,7 +36,7 @@ You can use this model directly with a pipeline for token classification :
 
 ## Training procedure
 
-The model is fine-tuned by [UER-py](https://github.com/dbiir/UER-py/) on [Tencent Cloud TI-ONE](https://cloud.tencent.com/product/tione/). We fine-tune five epochs with a sequence length of 512 on the basis of the pre-trained model [chinese_roberta_L-12_H-768](https://huggingface.co/uer/chinese_roberta_L-12_H-768). At the end of each epoch, the model is saved when the best performance on development set is achieved.
+The model is fine-tuned by [UER-py](https://github.com/dbiir/UER-py/) on [Tencent Cloud](https://cloud.tencent.com/). We fine-tune five epochs with a sequence length of 512 on the basis of the pre-trained model [chinese_roberta_L-12_H-768](https://huggingface.co/uer/chinese_roberta_L-12_H-768). At the end of each epoch, the model is saved when the best performance on development set is achieved.
 
 ```
 python3 run_ner.py --pretrained_model_path models/cluecorpussmall_roberta_base_seq512_model.bin-250000 \
@@ -74,6 +74,13 @@ python3 scripts/convert_bert_token_classification_from_uer_to_huggingface.py --i
   year={2019}
 }
 
+@article{xu2020cluener2020,
+  title={CLUENER2020: Fine-grained Name Entity Recognition for Chinese},
+  author={Xu, Liang and Dong, Qianqian and Yu, Cong and Tian, Yin and Liu, Weitang and Li, Lu and Zhang, Xuanwei},
+  journal={arXiv preprint arXiv:2001.04351},
+  year={2020}
+}
+
 @article{zhao2019uer,
   title={UER: An Open-Source Toolkit for Pre-training Models},
   author={Zhao, Zhe and Chen, Hui and Zhang, Jinbin and Zhao, Xin and Liu, Tao and Lu, Wei and Chen, Xi and Deng, Haotang and Ju, Qi and Du, Xiaoyong},
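
For reference, the context line of the first hunk mentions using the model directly with a pipeline for token classification. Below is a minimal sketch of that usage, assuming the published repository id is `uer/roberta-base-finetuned-cluener2020-chinese` (the exact id is not shown in this diff) and a recent `transformers` release:

```python
# Hedged sketch: the model id below is an assumption; replace it with the
# actual repository id of this fine-tuned NER model if it differs.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "uer/roberta-base-finetuned-cluener2020-chinese"  # assumed id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# Merge sub-token predictions into whole entity spans.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")
print(ner("江苏警方通报特斯拉冲进店铺"))
```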