---
license: apache-2.0
---
## Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain

For Chinese natural language processing in specific domains, we provide a Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain, named pai-dkplm-bert-zh.
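As a minimal usage sketch (assuming the checkpoint is published on the Hugging Face Hub; the model ID below is a placeholder for illustration, so substitute the actual repository name), the model can be queried through the standard Transformers fill-mask pipeline:

```python
from transformers import pipeline

# Placeholder model ID; replace with the actual Hub repository name.
fill_mask = pipeline("fill-mask", model="pai-dkplm-bert-zh")

# A medical-domain Chinese sentence with one masked token:
# "阿司匹林可用于治疗[MASK]痛。" ("Aspirin can be used to treat [MASK] pain.")
for prediction in fill_mask("阿司匹林可用于治疗[MASK]痛。"):
    print(prediction["token_str"], round(prediction["score"], 4))
```

Since the model is BERT-based, `[MASK]` is the mask token expected by the tokenizer, and the pipeline returns the top candidate fillers with their scores.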

**[DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding](https://arxiv.org/abs/2112.01047)**
Taolin Zhang, Chengyu Wang, Nan Hu, Minghui Qiu, Chengguang Tang, Xiaofeng He, Jun Huang

This repository is built on the [EasyNLP framework](https://github.com/alibaba/EasyNLP) developed by the Alibaba PAI team.

## Citation

If you find this resource useful, please cite the following papers in your work.

- For the EasyNLP framework:
```
@article{easynlp,
  title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  url = {https://arxiv.org/abs/2205.00258},
  publisher = {arXiv},
  year = {2022}
}
```

- For DKPLM:
```
@article{dkplm,
  title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
  author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
  url = {https://arxiv.org/abs/2112.01047},
  publisher = {arXiv},
  year = {2021}
}
```