---
language: 
  - en
  - ru
  - multilingual
license: apache-2.0
---
# XLM-RoBERTa large model whole word masking finetuned on SQuAD
Pretrained with a masked language modeling (MLM) objective and fine-tuned on English and Russian question-answering datasets.
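A minimal usage sketch with the `transformers` question-answering pipeline. The `model_id` below is a placeholder, not the model's actual Hub name:

```python
def answer(question: str, context: str,
           model_id: str = "xlm-roberta-large-qa"):  # placeholder ID, assumed for illustration
    """Run extractive QA with the fine-tuned model via the transformers pipeline."""
    # Import deferred so the sketch can be loaded without transformers installed.
    from transformers import pipeline
    qa = pipeline("question-answering", model=model_id, tokenizer=model_id)
    result = qa(question=question, context=context)
    return result["answer"]

# Example call (downloads model weights on first use):
# answer("What is SberQuAD?", "SberQuAD is a Russian reading-comprehension dataset.")
```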

## Used QA Datasets
SQuAD + SberQuAD

The [SberQuAD original paper](https://arxiv.org/pdf/1912.09723.pdf) describes the Russian dataset and is recommended reading.
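Scores on these datasets are computed SQuAD-style: answer-level exact match and token-level F1 over normalized text. A minimal sketch of the two metrics (note the article stripping is English-only, so Russian evaluation would skip that step):

```python
import re
import string
from collections import Counter

def normalize(text: str) -> str:
    """SQuAD-style normalization: lowercase, drop punctuation and English articles, squeeze whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction: str, reference: str) -> float:
    """1.0 if the normalized strings are identical, else 0.0."""
    return float(normalize(prediction) == normalize(reference))

def f1_score(prediction: str, reference: str) -> float:
    """Harmonic mean of token precision and recall over the normalized answers."""
    pred_tokens = normalize(prediction).split()
    ref_tokens = normalize(reference).split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)
```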

## Evaluation results
The results obtained on SberQuAD are the following:
```
f1 = 84.3
exact_match = 65.3
```