Commit baae56b by sijunhe (parent: 6c07fac): Update README.md

Files changed (1): README.md (+35 -5)

README.md:
---
library_name: paddlenlp
license: apache-2.0
language:
- zh
---

# PaddlePaddle/ernie-1.0-base-zh

## Introduction

We present a novel language representation model enhanced by knowledge, called ERNIE (Enhanced Representation through kNowledge IntEgration).
Inspired by the masking strategy of BERT, ERNIE is designed to learn language representations enhanced by knowledge masking strategies,
which include entity-level masking and phrase-level masking. The entity-level strategy masks entities, which are usually composed of multiple words.
The phrase-level strategy masks a whole phrase, composed of several words standing together as a conceptual unit.
Experimental results show that ERNIE outperforms other baseline methods, achieving new state-of-the-art results on five Chinese natural language processing tasks,
including natural language inference, semantic similarity, named entity recognition, sentiment analysis, and question answering.
We also demonstrate that ERNIE has a more powerful knowledge inference capacity on a cloze test.

More details: https://arxiv.org/abs/1904.09223
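
As a rough illustration of the idea (a toy sketch, not PaddleNLP's actual pretraining code; the `knowledge_mask` helper and the example spans are hypothetical), knowledge masking hides every token of a chosen entity or phrase span at once, rather than masking individual subwords independently:

```python
import random

def knowledge_mask(tokens, spans, mask_token="[MASK]", prob=0.15):
    """Mask whole entity/phrase spans instead of single tokens.

    tokens: list of str tokens.
    spans:  list of (start, end) index pairs marking entities or phrases.
    """
    out = list(tokens)
    for start, end in spans:
        if random.random() < prob:
            # Entity/phrase-level masking: hide the whole multi-word unit,
            # forcing the model to recover it from surrounding context.
            for i in range(start, end):
                out[i] = mask_token
    return out

tokens = ["Harry", "Potter", "is", "a", "series", "of", "fantasy", "novels"]
spans = [(0, 2)]  # "Harry Potter" is a single entity
print(knowledge_mask(tokens, spans, prob=1.0))
# ['[MASK]', '[MASK]', 'is', 'a', 'series', 'of', 'fantasy', 'novels']
```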

## Available Models

- ernie-1.0-base-zh
- ernie-1.0-large-zh-cw

## How to Use?

Click on the *Use in paddlenlp* button on the top right!
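
Alternatively, here is a minimal loading sketch using PaddleNLP's `transformers` module (assuming recent `paddlepaddle` and `paddlenlp` releases; the exact return format can vary across versions):

```python
import paddle
from paddlenlp.transformers import ErnieModel, ErnieTokenizer

# Download the pretrained weights and the matching tokenizer by model name.
tokenizer = ErnieTokenizer.from_pretrained("ernie-1.0-base-zh")
model = ErnieModel.from_pretrained("ernie-1.0-base-zh")

# Tokenize a Chinese sentence and add a batch dimension.
inputs = tokenizer("欢迎使用 PaddleNLP!")
inputs = {k: paddle.to_tensor([v]) for k, v in inputs.items()}

# sequence_output holds per-token hidden states; pooled_output is the
# [CLS]-based sentence representation.
sequence_output, pooled_output = model(**inputs)
print(sequence_output.shape)  # [1, seq_len, 768] for the base model
```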

## Citation Info

```text
@article{ernie1.0,
  title   = {ERNIE: Enhanced Representation through Knowledge Integration},
  author  = {Sun, Yu and Wang, Shuohuan and Li, Yukun and Feng, Shikun and Chen, Xuyi and Zhang, Han and Tian, Xin and Zhu, Danxiang and Tian, Hao and Wu, Hua},
  journal = {arXiv preprint arXiv:1904.09223},
  year    = {2019}
}
```