
From the Model Zoo at https://github.com/dbiir/UER-py/wiki/Modelzoo:

MixedCorpus+BertEncoder(large)+MlmTarget

https://share.weiyun.com/5G90sMJ

Pre-trained on a large mixed Chinese corpus. The configuration file is bert_large_config.json.
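
A minimal usage sketch, assuming the UER-py checkpoint has been converted to the Hugging Face Transformers format; the repository id below is a hypothetical placeholder, not taken from this card.

```python
# Sketch only: the repository id is an assumption, substitute the actual id
# of the converted checkpoint if it differs.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="uer/mixed_corpus_bert_large_model")

# Predict the masked token in a Chinese sentence.
print(unmasker("北京是[MASK]国的首都。"))
```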

Citation

@article{zhao2019uer,
  title={UER: An Open-Source Toolkit for Pre-training Models},
  author={Zhao, Zhe and Chen, Hui and Zhang, Jinbin and Zhao, Xin and Liu, Tao and Lu, Wei and Chen, Xi and Deng, Haotang and Ju, Qi and Du, Xiaoyong},
  journal={EMNLP-IJCNLP 2019},
  pages={241},
  year={2019}
}