---
language:
- multilingual
datasets:
- squad
---
# Multilingual BERT fine-tuned on SQuAD v1.1
## Training Arguments
```python
max_seq_length = 512
doc_stride = 256
max_answer_length = 64
batch_size = 16
gradient_accumulation_steps = 2
learning_rate = 5e-5
weight_decay = 3e-7
num_train_epochs = 3
warmup_ratio = 0.1
fp16 = True
fp16_opt_level = "O1"
seed = 0
```
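With `max_seq_length = 512` and `doc_stride = 256`, a context longer than the sequence limit is split into overlapping windows that advance `doc_stride` tokens at a time, as in the original BERT SQuAD preprocessing. A minimal sketch of that chunking (the `sliding_windows` helper is illustrative, not part of this repository):

```python
def sliding_windows(tokens, max_seq_length, doc_stride):
    """Split a long token sequence into overlapping windows.

    Each window holds up to max_seq_length tokens; successive windows
    start doc_stride tokens apart, so consecutive windows overlap by
    max_seq_length - doc_stride tokens.
    """
    windows = []
    start = 0
    while True:
        windows.append(tokens[start:start + max_seq_length])
        if start + max_seq_length >= len(tokens):
            break
        start += doc_stride
    return windows

# Toy example: 10 tokens, window of 4, stride of 2
chunks = sliding_windows(list(range(10)), max_seq_length=4, doc_stride=2)
print(chunks)  # [[0, 1, 2, 3], [2, 3, 4, 5], [4, 5, 6, 7], [6, 7, 8, 9]]
```

At inference time the answer spans predicted in each window are scored separately and the best-scoring span is kept, which is why the overlap matters: an answer cut off at one window boundary still appears whole in the next window.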
## Results
| EM | F1 |
| :---: | :---: |
| 81.731 | 89.009 |