|
--- |
|
language: |
|
- zh |
|
|
|
license: apache-2.0 |
|
|
|
tags: |
|
- bert |
|
- deberta |
|
|
|
inference: true |
|
|
|
widget: |
|
- text: "桂林是世界闻名的旅游城市,它有[MASK]江。" |
|
--- |
|
# Erlangshen-DeBERTa-v2-320M-Chinese, one model of [Fengshenbang-LM](https://github.com/IDEA-CCNL/Fengshenbang-LM)
|
|
|
A 320-million-parameter DeBERTa-v2 model with an encoder-only Transformer structure, pre-trained for 7 days on 8 A100 (80 GB) GPUs using 180 GB of Chinese data, consuming 250M samples in total.

**Our model is still training, and we will update it once a week!**
|
|
|
## Task Description |
|
|
|
Erlangshen-DeBERTa-v2-320M-Chinese is pre-trained with a BERT-like masked language modeling (MLM) task, following the DeBERTa [paper](https://readpaper.com/paper/3033187248).
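
As a rough illustration of this objective (a minimal sketch, not the actual pre-training code; the masked position is hand-picked for the example), one MLM step looks like:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese', use_fast=False)
model = AutoModelForMaskedLM.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese')

inputs = tokenizer('桂林是世界闻名的旅游城市,它有漓江。', return_tensors='pt')
input_ids = inputs['input_ids']

# Replace one token with [MASK]; the label is the original token id.
# Positions labeled -100 are ignored by the cross-entropy loss.
mask_pos = 5  # hand-picked position, for illustration only
labels = torch.full_like(input_ids, -100)
labels[0, mask_pos] = input_ids[0, mask_pos]
input_ids[0, mask_pos] = tokenizer.mask_token_id

outputs = model(input_ids=input_ids, attention_mask=inputs['attention_mask'], labels=labels)
print(outputs.loss)  # masked-LM loss over the single masked position
```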
|
|
|
## Usage |
|
|
|
```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, FillMaskPipeline

tokenizer = AutoTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese', use_fast=False)
model = AutoModelForMaskedLM.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese')

text = '桂林是世界闻名的旅游城市,它有[MASK]江。'
# device=0 runs the pipeline on the first GPU; use device=-1 for CPU.
fillmask_pipe = FillMaskPipeline(model, tokenizer, device=0)
print(fillmask_pipe(text, top_k=10))
```
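
The pipeline hides the decoding step. If you want the raw scores, something like the following should work (a sketch of what FillMaskPipeline does internally):

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese', use_fast=False)
model = AutoModelForMaskedLM.from_pretrained('IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese')

inputs = tokenizer('桂林是世界闻名的旅游城市,它有[MASK]江。', return_tensors='pt')
with torch.no_grad():
    logits = model(**inputs).logits

# Find the [MASK] position and take the 10 highest-probability tokens there.
mask_pos = (inputs['input_ids'][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
probs = logits[0, mask_pos].softmax(dim=-1)
top10 = probs.topk(10)
print(tokenizer.convert_ids_to_tokens(top10.indices[0].tolist()))
```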
|
|
|
## Finetune |
|
|
|
We present the dev-set results on several CLUE tasks.
|
|
|
| Model | AFQMC | TNEWS1.1 | IFLYTEK | OCNLI | CMNLI |
| ----- | ----- | -------- | ------- | ----- | ----- |
| RoBERTa-base | 0.7406 | 0.575 | 0.6036 | 0.743 | 0.7973 |
| RoBERTa-large | 0.7488 | 0.5879 | 0.6152 | 0.777 | 0.814 |
| [IDEA-CCNL/Erlangshen-DeBERTa-v2-97M-Chinese](https://huggingface.co/IDEA-CCNL/Erlangshen-DeBERTa-v2-97M-Chinese) | 0.7405 | 0.571 | 0.5977 | 0.7568 | 0.807 |
| **[IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese](https://huggingface.co/IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese)** | 0.7498 | 0.5817 | 0.6042 | 0.8022 | 0.8301 |
| [IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese](https://huggingface.co/IDEA-CCNL/Erlangshen-DeBERTa-v2-710M-Chinese) | 0.7549 | 0.5873 | 0.6177 | 0.8012 | 0.8389 |
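
For reference, a minimal fine-tuning sketch for one of these tasks. It assumes the CLUE benchmark is available on the Hugging Face Hub as the `clue` dataset (here with the `ocnli` config); the hyperparameters are illustrative and not the ones used to produce the table above.

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = 'IDEA-CCNL/Erlangshen-DeBERTa-v2-320M-Chinese'
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=3)  # OCNLI has 3 labels

dataset = load_dataset('clue', 'ocnli')  # assumed Hub dataset with sentence1/sentence2/label columns

def tokenize(batch):
    return tokenizer(batch['sentence1'], batch['sentence2'], truncation=True, max_length=128)

dataset = dataset.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir='erlangshen-320m-ocnli',
    per_device_train_batch_size=32,  # illustrative hyperparameters
    learning_rate=2e-5,
    num_train_epochs=3,
)
# Trainer pads each batch dynamically when a tokenizer is passed.
trainer = Trainer(model=model, args=args, tokenizer=tokenizer,
                  train_dataset=dataset['train'], eval_dataset=dataset['validation'])
trainer.train()
```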
|
|
|
## Citation |
|
|
|
If you find this resource useful, please cite the following website in your paper.
|
|
|
```bibtex
|
@misc{Fengshenbang-LM, |
|
title={Fengshenbang-LM}, |
|
author={IDEA-CCNL}, |
|
year={2022}, |
|
howpublished={\url{https://github.com/IDEA-CCNL/Fengshenbang-LM}}, |
|
} |
|
``` |
|
|