AndrewYan committed
Commit 29c5f2c
1 Parent(s): d57039d

Update README.md

Files changed (1): README.md (+35 -0)
README.md CHANGED
---
language: zh
pipeline_tag: fill-mask
widget:
- text: "感冒需要吃[MASK]"
- text: "人类的[MASK]温是37度"
tags:
- bert
license: apache-2.0
---
## Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain
For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model)** for the medical domain, named **pai-dkplm-bert-zh**, introduced in our AAAI 2021 paper **DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding**.

This repository is built on the EasyNLP framework ([https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP)) developed by the Alibaba PAI team. Please find the DKPLM tutorial here: [DKPLM Tutorial](https://github.com/alibaba/EasyNLP/tree/master/examples/dkplm_pretraining).
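
The front matter above declares `pipeline_tag: fill-mask`, so the checkpoint should be usable with the standard Hugging Face `fill-mask` pipeline. The snippet below is a minimal sketch rather than an official example: the Hub repository ID `alibaba-pai/pai-dkplm-bert-zh` is an illustrative placeholder (this card only gives the model name **pai-dkplm-bert-zh**), and it assumes the checkpoint loads as a standard BERT masked-language model.

```python
# Minimal sketch: masked-token prediction with the `transformers` fill-mask pipeline.
# Assumption: the checkpoint is hosted on the Hugging Face Hub under the illustrative
# repo ID below and exposes a standard BERT masked-LM head.
from transformers import pipeline

model_id = "alibaba-pai/pai-dkplm-bert-zh"  # hypothetical repo ID; replace with the actual one
fill_mask = pipeline("fill-mask", model=model_id)

# The two widget examples declared in the front matter of this card.
for text in ["感冒需要吃[MASK]", "人类的[MASK]温是37度"]:
    for pred in fill_mask(text, top_k=3):
        print(text, "->", pred["token_str"], round(pred["score"], 4))
```

For knowledge-enhanced pre-training or fine-tuning of DKPLM itself, follow the EasyNLP tutorial linked above rather than this raw pipeline call.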

## Citation
If you find this resource useful, please cite the following papers in your work.

- For the EasyNLP framework:
```
@article{easynlp,
  title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  publisher = {arXiv},
  author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  url = {https://arxiv.org/abs/2205.00258},
  year = {2022}
}
```
- For DKPLM:
```
@article{dkplm,
  title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
  author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
  url = {https://arxiv.org/abs/2112.01047},
  publisher = {AAAI},
  year = {2021}
}
```