---
language:
  - zh
license: apache-2.0
tags:
  - ZEN
  - chinese
inference: false
---

# Erlangshen-ZEN2-668M-Chinese

Erlangshen-ZEN2-668M-Chinese is one of the models of Fengshenbang-LM.

Erlangshen-ZEN2-668M-Chinese is an open-source Chinese pre-trained model contributed by the ZEN team to Fengshenbang-LM. IDEA-CCNL adapted the ZEN 2.0 source code and paper, and provides results and code samples for ZEN 2.0 on Chinese classification and extraction tasks. In the future, we will work with the ZEN team to explore how to optimize the pre-trained model and to continue improving its performance on classification and extraction tasks.

## Usage

The ZEN2 architecture is not included in Hugging Face Transformers, so clone Fengshenbang-LM to obtain the ZEN2 implementation:

```shell
git clone https://github.com/IDEA-CCNL/Fengshenbang-LM.git
```
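To make the `fengshen` package importable, the cloned repository can be installed in editable mode; a minimal sketch, assuming the repository ships an installable setup script at its root as its README describes:

```shell
cd Fengshenbang-LM
# Editable install so that `import fengshen` resolves to this checkout;
# alternatively, add the repository root to PYTHONPATH.
pip install --editable .
```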

Load the model:


```python
from fengshen.models.zen2.ngram_utils import ZenNgramDict
from fengshen.models.zen2.tokenization import BertTokenizer
from fengshen.models.zen2.modeling import ZenForSequenceClassification, ZenForTokenClassification

pretrain_path = 'IDEA-CCNL/Erlangshen-ZEN2-668M-Chinese'

tokenizer = BertTokenizer.from_pretrained(pretrain_path)
# Use ZenForSequenceClassification for classification tasks and
# ZenForTokenClassification for extraction (sequence labeling) tasks.
model = ZenForSequenceClassification.from_pretrained(pretrain_path)
# model = ZenForTokenClassification.from_pretrained(pretrain_path)
ngram_dict = ZenNgramDict.from_pretrained(pretrain_path, tokenizer=tokenizer)
```
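As a quick smoke test, the sketch below runs one sentence through the classification head. The n-gram inputs are left empty for brevity, and the argument names (`input_ngram_ids`, `ngram_position_matrix`) are an assumption carried over from the original ZEN codebase's forward convention; the finetuning examples linked below show how these tensors are actually built from `ngram_dict`.

```python
import torch

text = "今天天气很好。"  # "The weather is nice today."
tokens = ["[CLS]"] + tokenizer.tokenize(text) + ["[SEP]"]
input_ids = torch.tensor([tokenizer.convert_tokens_to_ids(tokens)])

# Placeholder n-gram tensors (no n-gram matched); in practice they are
# constructed from ngram_dict as in the fengshen finetuning examples.
input_ngram_ids = torch.zeros((1, 1), dtype=torch.long)
ngram_position_matrix = torch.zeros((1, len(tokens), 1), dtype=torch.float)

model.eval()
with torch.no_grad():
    # Assumed forward signature, following the original ZEN codebase;
    # check fengshen.models.zen2.modeling for the exact interface.
    logits = model(input_ids,
                   input_ngram_ids=input_ngram_ids,
                   ngram_position_matrix=ngram_position_matrix)
print(logits)
```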

Classification and extraction examples are available in Fengshenbang-LM:

- classification example on fengshen
- extraction example on fengshen

## Evaluation

### Classification

| Model (Acc)                  | afqmc | tnews | iflytek | ocnli | cmnli |
| :--------------------------- | :---- | :---- | :------ | :---- | :---- |
| Erlangshen-ZEN2-345M-Chinese | 0.741 | 0.584 | 0.599   | 0.788 | 0.80  |
| Erlangshen-ZEN2-668M-Chinese | 0.75  | 0.60  | 0.589   | 0.81  | 0.82  |

### Extraction

| Model (F1)                   | WEIBO (test) | Resume (test) | MSRA (test) | OntoNote4.0 (test) | CMeEE (dev) | CLUENER (dev) |
| :--------------------------- | :----------- | :------------ | :---------- | :----------------- | :---------- | :------------ |
| Erlangshen-ZEN2-345M-Chinese | 65.26        | 96.03         | 95.15       | 78.93              | 62.81       | 79.27         |
| Erlangshen-ZEN2-668M-Chinese | 70.02        | 96.08         | 95.13       | 80.89              | 63.37       | 79.22         |

## Citation

If you find this resource useful, please cite the following paper:

```bibtex
@article{Sinovation2021ZEN2,
  title={ZEN 2.0: Continue Training and Adaption for N-gram Enhanced Text Encoders},
  author={Yan Song and Tong Zhang and Yonggang Wang and Kai-Fu Lee},
  journal={arXiv preprint arXiv:2105.01279},
  year={2021}
}
```