wissamantoun committed on
Commit
3a8039d
1 Parent(s): 01d828e

added citation

Files changed (1)
  1. README.md +15 -26
README.md CHANGED
@@ -78,34 +78,25 @@ arabert_prep.preprocess(text)
 
 # TensorFlow 1.x models
 
- The TF1.x models are available in the HuggingFace models repo.
- You can download them as follows:
- - via git-lfs: clone all the models in a repo
- ```bash
- curl -s https://packagecloud.io/install/repositories/github/git-lfs/script.deb.sh | sudo bash
- sudo apt-get install git-lfs
- git lfs install
- git clone https://huggingface.co/aubmindlab/MODEL_NAME
- tar -C ./MODEL_NAME -zxvf /content/MODEL_NAME/tf1_model.tar.gz
- ```
- where `MODEL_NAME` is any model under the `aubmindlab` name
-
- - via `wget`:
-   - Go to the `tf1_model.tar.gz` file on huggingface.co/models/aubmindlab/MODEL_NAME.
-   - Copy the `oid sha256`.
-   - Then run `wget https://cdn-lfs.huggingface.co/aubmindlab/aragpt2-base/INSERT_THE_SHA_HERE` (e.g. for `aragpt2-base`: `wget https://cdn-lfs.huggingface.co/aubmindlab/aragpt2-base/3766fc03d7c2593ff2fb991d275e96b81b0ecb2098b71ff315611d052ce65248`)
 
 # If you used this model please cite us as :
 
 ```
- @misc{antoun2020aragpt2,
-     title={AraGPT2: Pre-Trained Transformer for Arabic Language Generation},
-     author={Wissam Antoun and Fady Baly and Hazem Hajj},
-     year={2020},
-     eprint={2012.15520},
-     archivePrefix={arXiv},
-     primaryClass={cs.CL}
 }
 ```
 
@@ -115,6 +106,4 @@ Thanks to TensorFlow Research Cloud (TFRC) for the free access to Cloud TPUs, co
 
 # Contacts
 **Wissam Antoun**: [Linkedin](https://www.linkedin.com/in/wissam-antoun-622142b4/) | [Twitter](https://twitter.com/wissam_antoun) | [Github](https://github.com/WissamAntoun) | <wfa07@mail.aub.edu> | <wissam.antoun@gmail.com>
 
- **Fady Baly**: [Linkedin](https://www.linkedin.com/in/fadybaly/) | [Twitter](https://twitter.com/fadybaly) | [Github](https://github.com/fadybaly) | <fgb06@mail.aub.edu> | <baly.fady@gmail.com>
-
-
 
 
 # TensorFlow 1.x models
 
+ **You can find the PyTorch, TF2 and TF1 models in HuggingFace's Transformer Library under the `aubmindlab` username**
+
+ - `wget https://huggingface.co/aubmindlab/MODEL_NAME/resolve/main/tf1_model.tar.gz` where `MODEL_NAME` is any model under the `aubmindlab` name
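For scripting, the `wget` one-liner above generalizes to any model name. A minimal Python sketch of the same download-and-unpack flow (the `tf1_url` and `download_and_extract` helpers are illustrative, not part of the repo):

```python
import tarfile
import urllib.request


def tf1_url(model_name: str) -> str:
    # Direct-download URL for the TF1 checkpoint archive of an aubmindlab model,
    # following the resolve/main pattern shown above.
    return f"https://huggingface.co/aubmindlab/{model_name}/resolve/main/tf1_model.tar.gz"


def download_and_extract(model_name: str, dest: str = ".") -> None:
    # Fetch the archive (requires network access) and unpack it into `dest`,
    # mirroring the wget + tar steps.
    archive = f"{model_name}-tf1_model.tar.gz"
    urllib.request.urlretrieve(tf1_url(model_name), archive)
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)
```

e.g. `download_and_extract("aragpt2-base")` retrieves and unpacks that model's TF1 checkpoint.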
 
  # If you used this model please cite us as :
 
 ```
+ @inproceedings{antoun-etal-2021-araelectra,
+     title = "{A}ra{ELECTRA}: Pre-Training Text Discriminators for {A}rabic Language Understanding",
+     author = "Antoun, Wissam  and
+       Baly, Fady  and
+       Hajj, Hazem",
+     booktitle = "Proceedings of the Sixth Arabic Natural Language Processing Workshop",
+     month = apr,
+     year = "2021",
+     address = "Kyiv, Ukraine (Virtual)",
+     publisher = "Association for Computational Linguistics",
+     url = "https://www.aclweb.org/anthology/2021.wanlp-1.20",
+     pages = "191--195",
 }
 ```
 
 # Contacts
 **Wissam Antoun**: [Linkedin](https://www.linkedin.com/in/wissam-antoun-622142b4/) | [Twitter](https://twitter.com/wissam_antoun) | [Github](https://github.com/WissamAntoun) | <wfa07@mail.aub.edu> | <wissam.antoun@gmail.com>
 
+ **Fady Baly**: [Linkedin](https://www.linkedin.com/in/fadybaly/) | [Twitter](https://twitter.com/fadybaly) | [Github](https://github.com/fadybaly) | <fgb06@mail.aub.edu> | <baly.fady@gmail.com>