aubmindlab committed on
Commit
3ab904f
1 Parent(s): ac7d6f6

Added model files

README.md ADDED
---
language: ar

widget:
- text: "وإذا كان هناك من لا يزال يعتقد أن لبنان هو سويسرا الشرق ، فهو مخطئ إلى حد بعيد . فلبنان ليس سويسرا ، ولا يمكن أن يكون كذلك . لقد عاش اللبنانيون في هذا البلد منذ ما يزيد عن ألف وخمسمئة عام ، أي منذ تأسيس الإمارة الشهابية التي أسسها الأمير فخر الدين المعني الثاني ( 1697 - 1742 )"
---

# AraGPT2 Detector

Machine-generated text detection model from the [AraGPT2: Pre-Trained Transformer for Arabic Language Generation paper](https://arxiv.org/abs/2012.15520).

This model is trained on long text passages and achieves a 99.4% F1-score.

# How to use it
```python
from transformers import pipeline
from arabert.preprocess import ArabertPreprocessor

# Preprocess input the same way the detector's base model (AraELECTRA) expects
processor = ArabertPreprocessor(model_name="aubmindlab/araelectra-base-discriminator")
# Text classification reuses the "sentiment-analysis" pipeline task
pipe = pipeline("sentiment-analysis", model="aubmindlab/aragpt2-mega-detector-long")

text = " "
text_prep = processor.preprocess(text)
result = pipe(text_prep)
# [{'label': 'machine-generated', 'score': 0.9977743625640869}]
```
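
To see the scores for both labels rather than only the top one, you can ask the pipeline for all scores. This is a minimal sketch, assuming a recent transformers version where the text-classification pipeline accepts `top_k` (older versions such as the 4.3.x line used `return_all_scores=True` instead):

```python
from transformers import pipeline
from arabert.preprocess import ArabertPreprocessor

processor = ArabertPreprocessor(model_name="aubmindlab/araelectra-base-discriminator")
pipe = pipeline("sentiment-analysis", model="aubmindlab/aragpt2-mega-detector-long")

text_prep = processor.preprocess(" ")  # your passage here
# top_k=None returns one entry per label instead of only the best one
results = pipe(text_prep, top_k=None)
# e.g. [{'label': 'machine-generated', 'score': ...},
#       {'label': 'human-written', 'score': ...}]
```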

# If you used this model, please cite us as:
```
@misc{antoun2020aragpt2,
      title={AraGPT2: Pre-Trained Transformer for Arabic Language Generation},
      author={Wissam Antoun and Fady Baly and Hazem Hajj},
      year={2020},
      eprint={2012.15520},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```

# Contacts
**Wissam Antoun**: [Linkedin](https://www.linkedin.com/in/wissam-antoun-622142b4/) | [Twitter](https://twitter.com/wissam_antoun) | [Github](https://github.com/WissamAntoun) | <wfa07@mail.aub.edu> | <wissam.antoun@gmail.com>

**Fady Baly**: [Linkedin](https://www.linkedin.com/in/fadybaly/) | [Twitter](https://twitter.com/fadybaly) | [Github](https://github.com/fadybaly) | <fgb06@mail.aub.edu> | <baly.fady@gmail.com>
config.json ADDED
{
  "_name_or_path": "aubmindlab/aragpt2-mega-detector-long",
  "architectures": [
    "ElectraForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "embedding_size": 768,
  "generator_hidden_size": 0.33333,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "machine-generated",
    "1": "human-written"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "human-written": 1,
    "machine-generated": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "electra",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "summary_activation": "gelu",
  "summary_last_dropout": 0.1,
  "summary_type": "first",
  "summary_use_proj": true,
  "transformers_version": "4.3.3",
  "type_vocab_size": 2,
  "vocab_size": 64000
}
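
The `id2label` / `label2id` entries are what the pipeline above uses to map the classifier's two output logits to the `machine-generated` and `human-written` strings. A minimal sketch of inspecting them through the standard transformers config API:

```python
from transformers import AutoConfig

config = AutoConfig.from_pretrained("aubmindlab/aragpt2-mega-detector-long")
# Output index 0 maps to "machine-generated", index 1 to "human-written"
print(config.id2label)    # {0: 'machine-generated', 1: 'human-written'}
print(config.model_type)  # "electra" -- the detector is an AraELECTRA classifier
```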
pytorch_model.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:b06fa5fe0c111c6a23700a1adc8fa9d088da70a23f2afc088db0d71de6a24737
size 540870089
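
What is committed here is only a Git LFS pointer; the ~540 MB checkpoint itself lives in LFS storage. A sketch of fetching the raw weights file without cloning the full repository, assuming the `huggingface_hub` client (which resolves LFS pointers automatically):

```python
from huggingface_hub import hf_hub_download

# Downloads the LFS-backed weights and returns the local cached path
path = hf_hub_download(
    repo_id="aubmindlab/aragpt2-mega-detector-long",
    filename="pytorch_model.bin",
)
print(path)
```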
special_tokens_map.json ADDED
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer_config.json ADDED
{"do_lower_case": false, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "max_len": 512, "do_basic_tokenize": true, "never_split": ["[بريد]", "[مستخدم]", "[رابط]"], "special_tokens_map_file": null, "name_or_path": "aubmindlab/araelectra-base-discriminator"}
vocab.txt ADDED
The diff for this file is too large to render.