Update README.md
README.md CHANGED
@@ -3,52 +3,42 @@ license: mit
 tags:
 - generated_from_trainer
 model-index:
-- name:
+- name: afro-xlmr-base
   results: []
 ---
 
-<!-- This model card has been generated automatically according to the information the Trainer had access to. You
-should probably proofread and complete it, then remove this comment. -->
-
-#
-
-This model is a fine-tuned version of [xlm-roberta-base](https://huggingface.co/xlm-roberta-base) on an unknown dataset.
-It achieves the following results on the evaluation set:
-- Loss: 1.4282
-
-## Model description
-
-More information needed
-
-## Intended uses & limitations
-
-More information needed
-
-## Training and evaluation data
-
-More information needed
-
-## Training procedure
-
-### Training hyperparameters
-
-The following hyperparameters were used during training:
-- learning_rate: 5e-05
-- train_batch_size: 10
-- eval_batch_size: 8
-- seed: 42
-- distributed_type: multi-GPU
-- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
-- lr_scheduler_type: linear
-- num_epochs: 3.0
-
-### Training results
-
-
-
-### Framework versions
-
-- Transformers 4.13.0
-- Pytorch 1.7.1+cu110
-- Datasets 1.16.1
-- Tokenizers 0.10.3
+
+# afro-xlmr-base
+
+AfroXLMR-base was created by MLM adaptation of the XLM-R-base model on 17 African languages (Afrikaans, Amharic, Hausa, Igbo, Malagasy, Chichewa, Oromo, Naija, Kinyarwanda, Kirundi, Shona, Somali, Sesotho, Swahili, isiXhosa, Yoruba, and isiZulu), covering the major African language families, and 3 high-resource languages (Arabic, French, and English).
+
+## Eval results on MasakhaNER (F-score)
+
+language | XLM-R-miniLM | XLM-R-base | XLM-R-large | afro-xlmr-base | afro-xlmr-small | afro-xlmr-mini
+-|-|-|-|-|-|-
+amh | 69.5 | 70.6 | 76.2 | 76.1 | 70.1 | 69.7
+hau | 74.5 | 89.5 | 90.5 | 91.2 | 91.4 | 87.7
+ibo | 81.9 | 84.8 | 84.1 | 87.4 | 86.6 | 83.5
+kin | 68.6 | 73.3 | 73.8 | 78.0 | 77.5 | 74.1
+lug | 64.7 | 79.7 | 81.6 | 82.9 | 83.2 | 77.4
+luo | 11.7 | 74.9 | 73.6 | 75.1 | 75.4 | 17.5
+pcm | 83.2 | 87.3 | 89.0 | 89.6 | 89.0 | 85.5
+swa | 86.3 | 87.4 | 89.4 | 88.6 | 88.7 | 86.0
+wol | 51.7 | 63.9 | 67.9 | 67.4 | 65.9 | 59.0
+yor | 72.0 | 78.3 | 78.9 | 82.1 | 81.3 | 75.1
+
+### BibTeX entry and citation info
+
+```
+@misc{afro_maft,
+  doi = {10.48550/ARXIV.2204.06487},
+  url = {https://arxiv.org/abs/2204.06487},
+  author = {Alabi, Jesujoba O. and Adelani, David Ifeoluwa and Mosbach, Marius and Klakow, Dietrich},
+  keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences},
+  title = {Multilingual Language Model Adaptive Fine-Tuning: A Study on African Languages},
+  publisher = {arXiv},
+  year = {2022},
+  copyright = {Creative Commons Attribution 4.0 International}
+}
+```
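The new card says the checkpoint was produced by MLM adaptation, i.e. continued masked-language-model pretraining of xlm-roberta-base on text in the listed languages. Below is a minimal sketch of that procedure with the Trainer API; the corpus file `african_corpus.txt`, the output path, and the hyperparameters are illustrative assumptions, not the values used for the released model.

```python
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from the base checkpoint named in the card.
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("xlm-roberta-base")

# Plain-text adaptation corpus, one document per line (path is hypothetical).
raw = load_dataset("text", data_files={"train": "african_corpus.txt"})
tokenized = raw["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
    remove_columns=["text"],
)

# Standard MLM objective: randomly mask 15% of tokens.
collator = DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="afro-xlmr-base", num_train_epochs=3),
    train_dataset=tokenized,
    data_collator=collator,
)
trainer.train()
```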
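Once adapted, the checkpoint is used like any XLM-R masked LM. A usage sketch with the fill-mask pipeline follows; the Hub id `Davlan/afro-xlmr-base` is assumed for illustration, so substitute the repository this card actually belongs to.

```python
from transformers import pipeline

# Hub id assumed for illustration; use the repository this card lives in.
unmasker = pipeline("fill-mask", model="Davlan/afro-xlmr-base")

# XLM-R checkpoints use <mask> as the mask token (Swahili example).
print(unmasker("Mji mkuu wa Kenya ni <mask>."))
```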
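The MasakhaNER numbers in the table are entity-level F-scores over BIO tag sequences. For reference, a toy computation with the seqeval library, which is commonly used for this metric; the tag sequences here are made up.

```python
from seqeval.metrics import f1_score

# Gold vs. predicted BIO tags, one inner list per sentence (toy data).
gold = [["B-PER", "I-PER", "O", "O", "B-LOC"]]
pred = [["B-PER", "I-PER", "O", "B-DATE", "B-LOC"]]

# Entity-level F1: two of the three predicted entities match gold exactly,
# so precision = 2/3, recall = 2/2, F1 = 0.8.
print(f1_score(gold, pred))
```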