Text2Text Generation · Transformers · PyTorch · TensorFlow · Arabic · t5 · Arabic T5 · MSA · Twitter · Arabic Dialect · Arabic Machine Translation · Arabic Text Summarization · Arabic News Title and Question Generation · Arabic Paraphrasing and Transliteration · Arabic Code-Switched Translation · Inference Endpoints · text-generation-inference
Update README.md
README.md CHANGED
@@ -67,17 +67,15 @@ AraT5 Pytorch and TensorFlow checkpoints are available on the Huggingface websit
 
 If you use our models (AraT5-base, AraT5-msa-base, AraT5-tweet-base, AraT5-msa-small, or AraT5-tweet-small) for your scientific publication, or if you find the resources in this repository useful, please cite our paper as follows (to be updated):
 ```bibtex
-@inproceedings{
-
-
-
-
-
-
-
-
-publisher = "Association for Computational Linguistics",
-}
+@inproceedings{nagoudi2022_arat5,
+  title = {AraT5: Text-to-Text Transformers for Arabic Language Generation},
+  author = {Nagoudi, El Moatez Billah and Elmadany, AbdelRahim and Abdul-Mageed, Muhammad},
+  booktitle = {Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics},
+  month = {May},
+  address = {Online},
+  year = {2022},
+  publisher = {Association for Computational Linguistics}
+}
 ```
 
 ## Acknowledgments