---
language: zh
pipeline_tag: fill-mask
tags:
- bert
license: apache-2.0
---
## Chinese Knowledge-enhanced BERT (CKBERT)

Knowledge-enhanced pre-trained language models (KEPLMs) improve context-aware representations by learning from structured relations in knowledge graphs and/or from linguistic knowledge obtained via syntactic or dependency analysis. Unlike for English, the natural language processing (NLP) community has lacked high-performing open-source Chinese KEPLMs to support various language understanding applications.

For Chinese natural language processing, we provide three **Chinese Knowledge-enhanced BERT (CKBERT)** models, named **pai-ckbert-base-zh**, **pai-ckbert-large-zh** and **pai-ckbert-huge-zh**, introduced in our **EMNLP 2022** paper **Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training**.

This repository is built on the EasyNLP framework: [https://github.com/alibaba/EasyNLP](https://github.com/alibaba/EasyNLP)
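
Since the model card declares `pipeline_tag: fill-mask`, the checkpoints can presumably be queried through the standard Hugging Face `transformers` fill-mask pipeline. Below is a minimal sketch, assuming the checkpoint follows the usual BERT masked-language-model interface; the model ID `alibaba-pai/pai-ckbert-base-zh` is an assumption, so substitute the checkpoint you intend to use.

```python
# Minimal fill-mask sketch, assuming a standard BERT-style checkpoint.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="alibaba-pai/pai-ckbert-base-zh",  # assumed model ID; adjust as needed
)

# BERT-style Chinese models use [MASK] as the masked-token placeholder.
# "中国的首都是[MASK]京。" = "The capital of China is [MASK]jing."
for candidate in fill_mask("中国的首都是[MASK]京。"):
    print(candidate["token_str"], candidate["score"])
```

The same checkpoints should also load through `AutoTokenizer` and `AutoModelForMaskedLM` if you need the raw logits rather than ranked predictions.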

## Citation
If you find this resource useful, please cite the following papers in your work.

- For the EasyNLP framework:
```
@article{easynlp,
  title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  publisher = {arXiv},
  url = {https://arxiv.org/abs/2205.00258},
  year = {2022}
}
```
- For CKBERT:
```
@article{ckbert,
  title = {Revisiting and Advancing Chinese Natural Language Understanding with Accelerated Heterogeneous Knowledge Pre-training},
  author = {Zhang, Taolin and Dong, Junwei and Wang, Jianing and Wang, Chengyu and Wang, An and Liu, Yinghui and Huang, Jun and Li, Yong and He, Xiaofeng},
  publisher = {EMNLP},
  url = {https://arxiv.org/abs/2210.05287},
  year = {2022}
}
```