## DeBERTa: Decoding-enhanced BERT with Disentangled Attention

[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. Trained with 80GB of data, it outperforms BERT and RoBERTa on the majority of NLU tasks.

Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
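
As a quick start, the model can be loaded through the Hugging Face `transformers` library. This is a minimal sketch; the checkpoint id `microsoft/deberta-base` is an assumption here — substitute the id of the checkpoint this card describes.

```python
# Minimal usage sketch with Hugging Face Transformers.
# Assumption: the checkpoint id "microsoft/deberta-base" stands in for
# the model this card describes.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModel.from_pretrained("microsoft/deberta-base")

inputs = tokenizer("DeBERTa uses disentangled attention.", return_tensors="pt")
outputs = model(**inputs)

# Contextual token embeddings, shaped (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

For task-specific heads (e.g. classification), swap `AutoModel` for the matching `AutoModelFor...` class.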