Commit 352a688 by elmadany (1 parent: 0641745): commit from elmadany

README.md (CHANGED)

<img src="https://raw.githubusercontent.com/UBC-NLP/marbert/main/ARBERT_MARBERT.jpg" alt="drawing" width="30%" height="30%" align="right"/>

**MARBERTv2** is one of three models described in our ACL 2021 paper ["ARBERT & MARBERT: Deep Bidirectional Transformers for Arabic"](https://aclanthology.org/2021.acl-long.551.pdf).
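
A minimal loading sketch with the Hugging Face `transformers` library follows; the Hub model ID `UBC-NLP/MARBERTv2` is an assumption based on this model card, and the Arabic input string is only illustrative.

```python
# Hedged sketch: load MARBERTv2 as a BERT-style masked language model.
# The Hub ID "UBC-NLP/MARBERTv2" is assumed from this model card.
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/MARBERTv2")
model = AutoModelForMaskedLM.from_pretrained("UBC-NLP/MARBERTv2")

# Encode an illustrative Arabic sentence and run a forward pass.
inputs = tokenizer("مرحبا بكم", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```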

We find that results with ARBERT and MARBERT on QA are not competitive, a clear discrepancy from what we have observed thus far on other tasks. We hypothesize this is because the two models are pre-trained with a sequence length of only 128, which does not allow them to sufficiently capture both a question and its likely answer within the same sequence window during pre-training.
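
The sequence-window limitation above can be illustrated with a short, hedged sketch: encoding a question together with its passage as one pair quickly exceeds a 128-token window. The Hub ID and the example strings are illustrative assumptions, not from the paper.

```python
# Hedged sketch: show that a "question [SEP] passage" pair can exceed a
# 128-token pre-training window. Strings and Hub ID are assumptions.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("UBC-NLP/MARBERTv2")

question = "متى تأسست الجامعة؟"        # hypothetical QA question
passage = "هذا نص توضيحي طويل. " * 40   # stand-in for a long passage

# Encode as a single question/passage pair, as extractive QA models do.
encoded = tokenizer(question, passage)
n_tokens = len(encoded["input_ids"])
print(n_tokens, "tokens;", "fits" if n_tokens <= 128 else "exceeds", "a 128-token window")
```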

For more information, please visit our own GitHub [repo](https://github.com/UBC-NLP/marbert).

# BibTeX

If you use our models (ARBERT, MARBERT, or MARBERTv2) for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):
```bibtex
@inproceedings{abdul-mageed-etal-2021-arbert,
title = "{ARBERT} {\&} {MARBERT}: Deep Bidirectional Transformers for {A}rabic",
    author = "Abdul-Mageed, Muhammad  and
      Elmadany, AbdelRahim  and
      Nagoudi, El Moatez Billah",
    booktitle = "Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)",
    month = aug,
    year = "2021",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://aclanthology.org/2021.acl-long.551",
}
```