julien-c committed
Commit 54eadd6
1 Parent(s): cdfb5ea

Migrate model card from transformers-repo

Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/microsoft/deberta-large/README.md

Files changed (1)
  1. README.md +37 -0
README.md ADDED

---
thumbnail: https://huggingface.co/front/thumbnails/microsoft.png
license: mit
---

## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. With those two improvements, DeBERTa outperforms RoBERTa on a majority of NLU tasks with 80GB of training data.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
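
A minimal usage sketch (not part of the original card) for loading this checkpoint with the Hugging Face `transformers` library and extracting contextual token embeddings; it assumes `torch` and `transformers` are installed:

```python
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-large")
model = AutoModel.from_pretrained("microsoft/deberta-large")

inputs = tokenizer(
    "DeBERTa improves BERT with disentangled attention.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# One hidden state per input token:
# shape (batch_size, sequence_length, 1024) for deberta-large.
print(outputs.last_hidden_state.shape)
```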

#### Fine-tuning on NLU tasks

We present the dev results on SQuAD 1.1/2.0 and several GLUE benchmark tasks.

| Model             | SQuAD 1.1 | SQuAD 2.0 | MNLI-m | SST-2 | QNLI | CoLA | RTE  | MRPC | QQP  | STS-B |
|-------------------|-----------|-----------|--------|-------|------|------|------|------|------|-------|
| BERT-Large        | 90.9/84.1 | 81.8/79.0 | 86.6   | 93.2  | 92.3 | 60.6 | 70.4 | 88.0 | 91.3 | 90.0  |
| RoBERTa-Large     | 94.6/88.9 | 89.4/86.5 | 90.2   | 96.4  | 93.9 | 68.0 | 86.6 | 90.9 | 92.2 | 92.4  |
| XLNet-Large       | 95.1/89.7 | 90.6/87.9 | 90.8   | 97.0  | 94.9 | 69.0 | 85.9 | 90.8 | 92.3 | 92.5  |
| **DeBERTa-Large** | 95.5/90.1 | 90.7/88.0 | 91.1   | 96.5  | 95.3 | 69.5 | 88.1 | 92.5 | 92.3 | 92.5  |
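
As a rough illustration only (not the recipe used to produce the numbers above; the hyperparameters below are placeholders), fine-tuning this checkpoint on one GLUE task such as MRPC with the `transformers` Trainer API might look like this:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-large")
model = AutoModelForSequenceClassification.from_pretrained(
    "microsoft/deberta-large", num_labels=2
)

# GLUE MRPC: sentence-pair paraphrase classification.
raw = load_dataset("glue", "mrpc")

def tokenize(batch):
    return tokenizer(
        batch["sentence1"], batch["sentence2"],
        truncation=True, max_length=128,
    )

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="deberta-large-mrpc",  # hypothetical output path
    learning_rate=1e-5,               # illustrative values, not tuned
    per_device_train_batch_size=16,
    num_train_epochs=3,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```

See the official repository for the actual training details behind these results.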

### Citation

If you find DeBERTa useful for your work, please cite the following paper:

```latex
@misc{he2020deberta,
    title={DeBERTa: Decoding-enhanced BERT with Disentangled Attention},
    author={Pengcheng He and Xiaodong Liu and Jianfeng Gao and Weizhu Chen},
    year={2020},
    eprint={2006.03654},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```