Datasets:
Tasks: Fill-Mask
Formats: csv
Sub-tasks: masked-language-modeling
Size: 1M - 10M
ArXiv:
Tags: afrolm, active learning, language modeling, research papers, natural language processing, self-active learning
License:
bonadossou committed
Commit • 307d2b5
Parent(s): daaa87d
Update README.md

README.md CHANGED
@@ -84,7 +84,7 @@ tokenizer = XLMRobertaTokenizer.from_pretrained("bonadossou/afrolm_active_learni
 tokenizer.model_max_length = 256
 ```
 
-`Autotokenizer` class does not successfully load our tokenizer. So we recommend
+The `AutoTokenizer` class does not successfully load our tokenizer, so we recommend using the `XLMRobertaTokenizer` class directly. Depending on your task, load the corresponding mode of the model. See the [XLMRoberta Documentation](https://huggingface.co/docs/transformers/model_doc/xlm-roberta).
 
 ## Reproducing our result: Training and Evaluation
@@ -95,17 +95,23 @@ tokenizer.model_max_length = 256
 
 ## Citation
-
-
-
-
-
-
-
-
-
-
-
+```bibtex
+@inproceedings{dossou-etal-2022-afrolm,
+    title = "{A}fro{LM}: A Self-Active Learning-based Multilingual Pretrained Language Model for 23 {A}frican Languages",
+    author = "Dossou, Bonaventure F. P. and
+      Tonja, Atnafu Lambebo and
+      Yousuf, Oreen and
+      Osei, Salomey and
+      Oppong, Abigail and
+      Shode, Iyanuoluwa and
+      Awoyomi, Oluwabusayo Olufunke and
+      Emezue, Chris",
+    booktitle = "Proceedings of The Third Workshop on Simple and Efficient Natural Language Processing (SustaiNLP)",
+    month = dec,
+    year = "2022",
+    address = "Abu Dhabi, United Arab Emirates (Hybrid)",
+    publisher = "Association for Computational Linguistics",
+    url = "https://aclanthology.org/2022.sustainlp-1.11",
+    pages = "52--64",
+}
+```
 
 ## Reach out