julien-c (HF staff) committed on
Commit bcfa393 • 1 Parent(s): 7391e85

Migrate model card from transformers-repo


Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/nghuyong/ernie-1.0/README.md

---
language: zh
---

# ERNIE-1.0

## Introduction

ERNIE (Enhanced Representation through kNowledge IntEgration) was proposed by Baidu in 2019. It is designed to learn language representations enhanced by knowledge masking strategies, i.e. entity-level masking and phrase-level masking. Experimental results show that ERNIE achieves state-of-the-art results on five Chinese natural language processing tasks: natural language inference, semantic similarity, named entity recognition, sentiment analysis, and question answering.

More details: https://arxiv.org/abs/1904.09223

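Concretely, the difference between token-level (BERT-style) masking and entity/phrase-level masking can be sketched in plain Python. This is an illustration only, not Baidu's implementation; the `phrase_mask` helper and the hard-coded phrase spans are hypothetical:

```python
import random

def token_mask(tokens, rate=0.15, seed=0):
    # BERT-style: mask individual tokens independently of each other
    rng = random.Random(seed)
    return [t if rng.random() > rate else "[MASK]" for t in tokens]

def phrase_mask(tokens, phrases, n=1, seed=0):
    # ERNIE-style idea: mask every token of a chosen phrase/entity at once,
    # forcing the model to predict the whole unit from its context
    rng = random.Random(seed)
    masked = list(tokens)
    for start, end in rng.sample(phrases, n):
        for i in range(start, end):
            masked[i] = "[MASK]"
    return masked

tokens = ["哈", "尔", "滨", "是", "黑", "龙", "江", "的", "省", "会"]
phrases = [(0, 3), (4, 7)]  # "哈尔滨" and "黑龙江" treated as single units
print(phrase_mask(tokens, phrases))  # one whole entity is masked together
```

With entity-level spans, the masked characters always form a complete named entity, which is the knowledge-integration signal the paper describes.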
## Released Model Info

| Model Name | Language | Model Structure |
|:---:|:---:|:---:|
| ernie-1.0 | Chinese | Layers: 12, Hidden: 768, Heads: 12 |

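As a rough sanity check, the parameter count implied by the table can be estimated with back-of-the-envelope arithmetic. This is a sketch under stated assumptions: a BERT-style encoder with a 4x feed-forward width and an ~18000-token Chinese vocabulary (an assumption, check the checkpoint's config for the real value), ignoring position and segment embeddings:

```python
hidden = 768
layers = 12
ffn = 4 * hidden   # 3072, the standard transformer feed-forward width (assumed)
vocab = 18000      # assumption: approximate Chinese vocab size

# Per layer: self-attention (4 weight matrices + biases) + FFN + 2 LayerNorms
attn = 4 * (hidden * hidden + hidden)
ffn_params = hidden * ffn + ffn + ffn * hidden + hidden
norms = 2 * 2 * hidden
per_layer = attn + ffn_params + norms

embeddings = vocab * hidden  # token embeddings only
total = layers * per_layer + embeddings
print(f"{total / 1e6:.1f}M parameters (rough)")
```

The result lands in the same ballpark as BERT-base, which matches the Layer/Hidden/Heads figures in the table.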
This released PyTorch model was converted from the officially released PaddlePaddle ERNIE model, and a series of experiments were conducted to verify the accuracy of the conversion.

- Official PaddlePaddle ERNIE repo: https://github.com/PaddlePaddle/ERNIE
- PyTorch conversion repo: https://github.com/nghuyong/ERNIE-Pytorch

## How to use

```python
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-1.0")
model = AutoModel.from_pretrained("nghuyong/ernie-1.0")
```
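
Once loaded, the model can be used as a feature extractor. A minimal sketch (the example sentence and the mean-pooling choice are ours, not an official recommendation; weights are downloaded on first use):

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("nghuyong/ernie-1.0")
model = AutoModel.from_pretrained("nghuyong/ernie-1.0")

# Tokenize a Chinese sentence and run a forward pass without gradients
inputs = tokenizer("百度是一家高科技公司", return_tensors="pt")
with torch.no_grad():
    out = model(**inputs)

# Mean-pool the last hidden states (hidden size 768, per the table above)
sentence_vec = out.last_hidden_state.mean(dim=1)
print(sentence_vec.shape)
```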

## Citation

```bibtex
@article{sun2019ernie,
  title={ERNIE: Enhanced Representation through Knowledge Integration},
  author={Sun, Yu and Wang, Shuohuan and Li, Yukun and Feng, Shikun and Chen, Xuyi and Zhang, Han and Tian, Xin and Zhu, Danxiang and Tian, Hao and Wu, Hua},
  journal={arXiv preprint arXiv:1904.09223},
  year={2019}
}
```