AndrewYan committed
Commit ebd0799
1 Parent(s): c07fe68

Update README.md

Files changed (1):
  1. README.md +11 -9
README.md CHANGED
@@ -5,10 +5,13 @@ tags:
 - bert
 license: apache-2.0
 ---
-## Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the medical domain
-For Chinese natural language processing in specific domains, we provide **Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model)** for the medical domain, named **pai-dkplm-bert-zh**, from our AAAI 2021 paper **DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding**.
-
-This repository is developed based on the EasyNLP framework from the Alibaba PAI team: [https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP). Please find the DKPLM tutorial here: [DKPLM Tutorial](https://github.com/alibaba/EasyNLP/tree/master/examples/dkplm_pretraining).
+## Chinese Knowledge-enhanced BERT (CKBERT)
+
+Knowledge-enhanced pre-trained language models (KEPLMs) improve context-aware representations by learning from structured relations in knowledge graphs and/or from linguistic knowledge derived from syntactic or dependency analysis. Unlike for English, the natural language processing (NLP) community lacks high-performing open-source Chinese KEPLMs to support various language understanding applications.
+
+For Chinese natural language processing, we provide three **Chinese Knowledge-enhanced BERT (CKBERT)** models, named **pai-ckbert-bert-zh**, **pai-ckbert-large-zh** and **pai-ckbert-huge-zh**, from our **EMNLP 2022** paper **Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training**.
+
+This repository is developed based on the EasyNLP framework: [https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP)
 
 ## Citation
 If you find this resource useful, please cite the following papers in your work.
@@ -25,11 +28,10 @@ title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language P
 ```
 - For DKPLM:
 ```
-@article{dkplm,
-title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding},
-author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun},
-url = {https://arxiv.org/abs/2112.01047},
-publisher = {AAAI},
-year = {2021}
+@article{ckbert,
+title = {Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training},
+author = {Zhang, Taolin and Dong, Junwei and Wang, Jianing and Wang, Chengyu and Wang, An and Liu, Yinghui and Huang, Jun and Li, Yong and He, Xiaofeng},
+publisher = {EMNLP},
+year = {2022}
 }
 ```
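
Beyond the diff itself, the CKBERT checkpoints named in the updated README are BERT-style masked language models, so they should load with the standard `transformers` API. The following is a minimal sketch, not part of the commit; the Hub model ID (the `alibaba-pai/` org prefix) is an assumption:

```python
# Minimal sketch: load a CKBERT checkpoint as a masked language model.
# The Hub ID below is an assumption (the org prefix is not confirmed by
# this commit); substitute the actual repository ID.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_id = "alibaba-pai/pai-ckbert-bert-zh"  # hypothetical Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask probe: "The capital of China is [MASK]jing."
inputs = tokenizer("中国的首都是[MASK]京。", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and decode the highest-scoring token.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
print(tokenizer.decode(logits[0, mask_pos].argmax(dim=-1)))  # expected: 北
```

Since the knowledge enhancement happens during pre-training rather than at inference time, consuming the checkpoints should require no EasyNLP-specific code; the pre-training and fine-tuning recipes live in the EasyNLP repository linked above.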