DeBERTa committed 03c4581 (1 parent: 3546ffd)

Update README.md

Files changed (1): README.md (+1 −1)
README.md CHANGED
@@ -7,7 +7,7 @@ license: mit
 
 ## DeBERTa: Decoding-enhanced BERT with Disentangled Attention
 
-[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and enhanced mask decoder. With those two improvements, DeBERTa out perform RoBERTa on a majority of NLU tasks with 80GB training data.
+[DeBERTa](https://arxiv.org/abs/2006.03654) improves the BERT and RoBERTa models using disentangled attention and an enhanced mask decoder. It outperforms BERT and RoBERTa on a majority of NLU tasks with 80GB of training data.
 
 Please check the [official repository](https://github.com/microsoft/DeBERTa) for more details and updates.
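The model described in the README above can be loaded through the Hugging Face `transformers` library. A minimal sketch follows; the checkpoint id `microsoft/deberta-base` is an assumption (this card's actual repo id may differ), and the hidden size of 768 applies to the base variant only.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# NOTE: checkpoint id is an assumption; substitute this model card's repo id.
checkpoint = "microsoft/deberta-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModel.from_pretrained(checkpoint)

# Encode a sample sentence and run a forward pass without gradients.
inputs = tokenizer(
    "DeBERTa improves BERT with disentangled attention.",
    return_tensors="pt",
)
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, sequence_length, hidden_size);
# hidden_size is 768 for the base model.
hidden = outputs.last_hidden_state
print(hidden.shape)
```

For downstream NLU tasks, the same checkpoint can instead be loaded with a task head, e.g. `AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=...)`.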