---
language: zh
---

# ERNIE-3.0-nano-zh

## Introduction

ERNIE 3.0: Large-scale Knowledge Enhanced Pre-training for Language Understanding and Generation

More details: https://arxiv.org/abs/2107.02137

## Released Model Info

This released PyTorch model is converted from the officially released PaddlePaddle ERNIE model, and a series of experiments has been conducted to verify the accuracy of the conversion.

## How to use

To use ERNIE-3.0 series models, you need to add `task_type_id` to the BERT model following this MR, or you can reinstall transformers from my modified branch:

```bash
pip uninstall transformers  # optional
pip install git+https://github.com/nghuyong/transformers@add_task_type_id  # reinstall
```

Then you can load the ERNIE-3.0 model as before:

```python
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("nghuyong/ernie-3.0-nano-zh")
model = BertModel.from_pretrained("nghuyong/ernie-3.0-nano-zh")
```

## Citation

```bibtex
@article{sun2021ernie,
  title={Ernie 3.0: Large-scale knowledge enhanced pre-training for language understanding and generation},
  author={Sun, Yu and Wang, Shuohuan and Feng, Shikun and Ding, Siyu and Pang, Chao and Shang, Junyuan and Liu, Jiaxiang and Chen, Xuyi and Zhao, Yanbin and Lu, Yuxiang and others},
  journal={arXiv preprint arXiv:2107.02137},
  year={2021}
}
```