Commit 6ea8f1d by julien-c (HF staff)
1 Parent(s): 0148e06

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/nghuyong/ernie-2.0-en/README.md

Files changed (1): README.md added (+43 lines)

---
language: en
---

# ERNIE-2.0

## Introduction

ERNIE 2.0 is a continual pre-training framework proposed by Baidu in 2019, which incrementally builds and learns pre-training tasks through continual multi-task learning. Experimental results demonstrate that ERNIE 2.0 outperforms BERT and XLNet on 16 tasks, including the English tasks of the GLUE benchmark and several common tasks in Chinese.

More details: https://arxiv.org/abs/1907.12412

## Released Model Info

| Model Name | Language | Model Structure |
| :---: | :---: | :---: |
| ernie-2.0-en | English | Layers: 12, Hidden: 768, Heads: 12 |
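
The values in the table can be checked against the checkpoint's configuration. A minimal sketch (not part of the original card), assuming the config exposes the standard BERT-style attribute names:

```python
from transformers import AutoConfig

# Fetch only the configuration and compare it with the table above
config = AutoConfig.from_pretrained("nghuyong/ernie-2.0-en")
print(config.num_hidden_layers)    # 12 layers
print(config.hidden_size)          # 768 hidden units
print(config.num_attention_heads)  # 12 attention heads
```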

This released PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and a series of experiments were conducted to check the accuracy of the conversion (a basic load-time check is sketched after the links below).

- Official PaddlePaddle ERNIE repo: https://github.com/PaddlePaddle/ERNIE
- PyTorch conversion repo: https://github.com/nghuyong/ERNIE-Pytorch
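
As a minimal sketch of such a check (an illustration, not one of the experiments referenced above), `from_pretrained` can report whether any weights were left unmatched when the converted checkpoint is loaded:

```python
from transformers import AutoModel

# output_loading_info=True also returns a dict describing any weights
# that could not be matched to the model architecture
model, loading_info = AutoModel.from_pretrained(
    "nghuyong/ernie-2.0-en", output_loading_info=True
)
print(loading_info["missing_keys"])     # ideally an empty list
print(loading_info["unexpected_keys"])  # ideally an empty list
```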

## How to use

```python
from transformers import AutoTokenizer, AutoModel

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-2.0-en")
model = AutoModel.from_pretrained("nghuyong/ernie-2.0-en")
```
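
Beyond loading the weights, here is a short usage sketch (not from the original card) showing how a sentence can be encoded and passed through the model:

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-2.0-en")
model = AutoModel.from_pretrained("nghuyong/ernie-2.0-en")

# Tokenize an example sentence and run a forward pass without gradients
inputs = tokenizer("ERNIE 2.0 is a continual pre-training framework.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One 768-dimensional vector per input token
print(outputs.last_hidden_state.shape)  # torch.Size([1, sequence_length, 768])
```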

## Citation

```bibtex
@article{sun2019ernie20,
  title={ERNIE 2.0: A Continual Pre-training Framework for Language Understanding},
  author={Sun, Yu and Wang, Shuohuan and Li, Yukun and Feng, Shikun and Tian, Hao and Wu, Hua and Wang, Haifeng},
  journal={arXiv preprint arXiv:1907.12412},
  year={2019}
}
```