Update README.md
README.md
## Pretraining Corpus

The `AraBertMo_base_V1` model was pre-trained on ~3 million words:

- Arabic version of [OSCAR](https://traces1.inria.fr/oscar/)

## Training results

This model achieves the following results:

| Task | Num examples | Num epochs | Batch size | Steps | Wall time | Training loss |
|:---------:|:------------:|:----------:|:----------:|:-----:|:---------:|:-------------:|
| Fill-Mask | 10010 | 1 | 64 | 157 | 2m 2s | 9.0183 |
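The step count in the table follows from the corpus and batch sizes: one epoch over 10,010 examples at batch size 64 takes ceil(10010 / 64) optimizer steps. A quick sanity check in plain Python (not part of the training code):

```python
import math

# One epoch over 10,010 training examples with a batch size of 64:
# the final partial batch still counts as a step, hence the ceiling.
num_examples = 10010
batch_size = 64
steps_per_epoch = math.ceil(num_examples / batch_size)
print(steps_per_epoch)  # 157, matching the "Steps" column
```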
### BibTeX entry and citation info
## Load Pretrained Model

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Ebtihal/AraBertMo_base_V1")
model = AutoModelForMaskedLM.from_pretrained("Ebtihal/AraBertMo_base_V1")
```
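A quick way to try the checkpoint is the `fill-mask` pipeline from `transformers`. This is a usage sketch, not from the model card: the Arabic sentence below is an arbitrary example, and the first call downloads the model weights.

```python
from transformers import pipeline

# Build a fill-mask pipeline around the pretrained checkpoint
# (downloads the weights from the Hub on first use).
fill_mask = pipeline("fill-mask", model="Ebtihal/AraBertMo_base_V1")

# Predict candidates for the [MASK] token in an Arabic sentence
# (an arbitrary illustrative sentence: "The Arabic language is a [MASK] language.").
predictions = fill_mask("اللغة العربية هي لغة [MASK].")
for p in predictions:
    print(p["token_str"], round(p["score"], 4))
```

Each prediction is a dict with the candidate token (`token_str`), its probability (`score`), and the completed sentence (`sequence`).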

This model was built for master's degree research at:

- [University of Kufa](https://uokufa.edu.iq/)
- [Faculty of Computer Science and Mathematics](https://mathcomp.uokufa.edu.iq/)
- **Department of Computer Science**